llama_cpp_low

Crates.io: llama_cpp_low
lib.rs: llama_cpp_low
version: 0.3.14
source: src
created_at: 2024-05-05 01:26:42.491115
updated_at: 2024-09-06 04:21:56.275395
description: small server binary compile build from llama.cpp
homepage:
repository: https://github.com/blmarket/llm-daemon
max_upload_size:
id: 1229960
size: 14,984,909
owner: Jeong, Heon (blmarket)

documentation:

README

llama-cpp-low

Script to build the llama.cpp server binary using cargo.
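
To give an idea of what such a build step involves, here is a minimal sketch of driving llama.cpp's CMake build from a Cargo build script. It is illustrative only: the use of the `cmake` crate, the `LLAMA_BUILD_SERVER` option, and the `llama-server` output path are assumptions for the sketch, not necessarily how this crate actually does it.

```rust
// build.rs (illustrative sketch, not this crate's actual build script)
// Builds llama.cpp with CMake and copies the resulting server binary
// into OUT_DIR so a daemon can launch it later.
use std::env;
use std::path::PathBuf;

fn main() {
    // Assumes llama.cpp sources are vendored under ./llama.cpp
    let dst = cmake::Config::new("llama.cpp")
        .define("LLAMA_BUILD_SERVER", "ON") // assumed option name
        .build();

    // The server binary usually lands under the CMake install dir;
    // the exact path depends on the llama.cpp version.
    let server = dst.join("bin").join("llama-server");

    let out_dir = PathBuf::from(env::var("OUT_DIR").unwrap());
    std::fs::copy(&server, out_dir.join("llama-server"))
        .expect("copy server binary into OUT_DIR");

    // Rebuild when the vendored llama.cpp sources change.
    println!("cargo:rerun-if-changed=llama.cpp");
}
```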

Wait, are you sober?

I just wanted a daemon to run the LLM with minimal external dependencies...
