llama_cpp_low

Version: 0.6.2
Created: 2024-05-05
Updated: 2025-08-29
Description: small server binary built from llama.cpp
Repository: https://github.com/blmarket/llm-daemon
Size: 38,700,267 bytes
Owner: Jeong, Heon (blmarket)


README

llama-cpp-low

A build script that compiles the llama.cpp server binary as part of a cargo build.

Wait, are you sober?

I just wanted a daemon that runs an LLM with minimal external dependencies...
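
Under the hood the crate is essentially a build wrapper: cargo's build script compiles the llama.cpp sources and makes the resulting server binary available to the parent crate (llm-daemon). The sketch below illustrates that idea using the `cmake` crate; the vendored source path, the CMake option names, and the `LLAMA_SERVER_DIR` environment variable are illustrative assumptions, not the crate's actual configuration.

```rust
// build.rs -- minimal sketch of building the llama.cpp server during a cargo build.
// Assumes the llama.cpp sources are vendored at ./llama.cpp and that the `cmake`
// crate is declared as a build-dependency. Option names below are assumptions
// for illustration, not the crate's real settings.

fn main() {
    // Configure and build the bundled llama.cpp tree with CMake.
    let dst = cmake::Config::new("llama.cpp")
        .define("LLAMA_BUILD_SERVER", "ON")   // assumed flag: build the HTTP server binary
        .define("LLAMA_BUILD_TESTS", "OFF")
        .define("LLAMA_BUILD_EXAMPLES", "OFF")
        .build();

    // Export the install prefix so a consumer (e.g. llm-daemon) can locate
    // and spawn the compiled server binary at runtime.
    println!("cargo:rustc-env=LLAMA_SERVER_DIR={}", dst.display());

    // Rebuild only when the vendored sources change.
    println!("cargo:rerun-if-changed=llama.cpp");
}
```

A consumer could then read `env!("LLAMA_SERVER_DIR")` at compile time to find the binary and spawn it as a child process, which keeps the runtime dependency surface to little more than a C/C++ toolchain and CMake at build time.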

