llama_cpp_low

Crates.io: llama_cpp_low
lib.rs: llama_cpp_low
version: 0.7.0
created_at: 2024-05-05 01:26:42.491115+00
updated_at: 2025-09-06 15:15:10.352422+00
description: small server binary built from llama.cpp
repository: https://github.com/blmarket/llm-daemon
id: 1229960
size: 38,905,892
owner: Jeong, Heon (blmarket)

README

llama-cpp-low

A build script that compiles the llama.cpp server binary as part of a cargo build.

Wait, are you sober?

I just wanted a daemon that runs an LLM with minimal external dependencies...
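
As a rough illustration of the approach, a crate like this can drive the llama.cpp build from `build.rs`: configure the vendored llama.cpp sources with CMake, build the server target, and copy the resulting binary into Cargo's output directory so the daemon can spawn it at runtime. The sketch below is an assumption-laden example, not the crate's actual build script: it presumes the llama.cpp sources are vendored under `llama.cpp/`, that the `cmake` crate is a build-dependency, and that the CMake option and binary names match the llama.cpp revision being built.

```rust
// build.rs -- hypothetical sketch of building llama.cpp's server binary.
// Assumes llama.cpp sources are vendored under ./llama.cpp and that the
// `cmake` crate is listed under [build-dependencies].

use std::env;
use std::path::PathBuf;

fn main() {
    // Configure and build the vendored llama.cpp tree with CMake.
    // The option names below are illustrative; they vary across llama.cpp revisions.
    let dst = cmake::Config::new("llama.cpp")
        .define("LLAMA_BUILD_SERVER", "ON")
        .define("LLAMA_BUILD_TESTS", "OFF")
        .define("LLAMA_BUILD_EXAMPLES", "OFF")
        .build();

    // Copy the installed server binary next to the crate's build output so the
    // Rust code can locate and spawn it at runtime. The binary name and install
    // path depend on llama.cpp's own install rules.
    let out_dir = PathBuf::from(env::var("OUT_DIR").expect("OUT_DIR not set"));
    let server_src = dst.join("bin").join("llama-server");
    let server_dst = out_dir.join("llama-server");
    std::fs::copy(&server_src, &server_dst).expect("failed to copy server binary");

    // Rebuild if the vendored sources change.
    println!("cargo:rerun-if-changed=llama.cpp");
}
```

At runtime the daemon could then locate the copied binary via `env!("OUT_DIR")` and launch it with `std::process::Command`, keeping the only external dependency a working C/C++ toolchain and CMake.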
