| Crates.io | llama_cpp_low |
| lib.rs | llama_cpp_low |
| version | 0.7.0 |
| created_at | 2024-05-05 01:26:42.491115+00 |
| updated_at | 2025-09-06 15:15:10.352422+00 |
| description | small server binary built from llama.cpp |
| homepage | |
| repository | https://github.com/blmarket/llm-daemon |
| max_upload_size | |
| id | 1229960 |
| size | 38,905,892 |
A script to build the llama.cpp server binary using cargo.
I just wanted a daemon to run an LLM with minimal external dependencies...