| Crates.io | llama_cpp_low |
|---|---|
| lib.rs | llama_cpp_low |
| version | 0.3.14 |
| source | src |
| created_at | 2024-05-05 01:26:42.491115 |
| updated_at | 2024-09-06 04:21:56.275395 |
| description | small server binary compiled from llama.cpp |
| homepage | |
| repository | https://github.com/blmarket/llm-daemon |
| max_upload_size | |
| id | 1229960 |
| size | 14,984,909 |
Script to build the llama.cpp server binary using cargo.

I just wanted a daemon to run an LLM with minimal external dependencies...
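Since the crate's stated purpose is compiling llama.cpp's server binary from a cargo build, the core pattern is a `build.rs` that drives CMake. Below is a minimal sketch of that pattern; the vendored source path (`vendor/llama.cpp`) and the `LLAMA_BUILD_SERVER` CMake flag are assumptions drawn from upstream llama.cpp, not from this crate's actual build script.

```rust
use std::process::Command;

/// Assemble a CMake configure invocation for building llama.cpp's `server`
/// target from a cargo build script. The source path and the
/// `LLAMA_BUILD_SERVER` flag are hypothetical, based on upstream llama.cpp.
fn cmake_configure_args(src_dir: &str, build_dir: &str) -> Vec<String> {
    vec![
        "-S".to_string(),
        src_dir.to_string(),
        "-B".to_string(),
        build_dir.to_string(),
        "-DCMAKE_BUILD_TYPE=Release".to_string(),
        "-DLLAMA_BUILD_SERVER=ON".to_string(),
    ]
}

fn main() {
    // In a real build.rs, cargo sets OUT_DIR; fall back for demonstration.
    let out_dir = std::env::var("OUT_DIR")
        .unwrap_or_else(|_| "target/llama-build".to_string());
    let args = cmake_configure_args("vendor/llama.cpp", &out_dir);

    // Show the command that would be run; a real script would execute this
    // configure step, then run `cmake --build <dir> --target server`.
    let mut cmd = Command::new("cmake");
    cmd.args(&args);
    println!("would run: {:?}", cmd);
}
```

A real build script would also emit `cargo:rerun-if-changed=...` directives so cargo rebuilds the binary only when the vendored sources change.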