| Crates.io | llama-desktop |
| lib.rs | llama-desktop |
| version | 2.2.10 |
| source | src |
| created_at | 2024-01-24 04:57:13.732521 |
| updated_at | 2024-11-19 21:57:49.780339 |
| description | Desktop interface for Ollama |
| homepage | https://crates.io/crates/llama-desktop |
| repository | https://github.com/cacilhas/llama-desktop |
| max_upload_size | |
| id | 1111685 |
| size | 1,140,571 |
Desktop app to connect to Ollama and send queries.

Llama Desktop reads the Ollama service URI from the environment variable `OLLAMA_HOST`, defaulting to `http://localhost:11434`.
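As an illustration of that convention (a minimal sketch, not the crate's actual code), reading the variable with a fallback might look like this:

```rust
use std::env;

fn main() {
    // Fall back to Ollama's default address when OLLAMA_HOST is unset.
    let host = env::var("OLLAMA_HOST")
        .unwrap_or_else(|_| "http://localhost:11434".to_string());
    println!("Connecting to Ollama at {host}");
}
```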
If you have an NVIDIA GPU and want to run Ollama locally:

```sh
curl -fsSL https://ollama.com/install.sh | sh
systemctl enable ollama
systemctl start ollama
ollama pull mistral:latest
ollama pull phind-codellama:latest
```
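Before launching Llama Desktop, you may want to confirm the service is actually reachable. A dependency-free probe of Ollama's default port (assuming the default `localhost:11434`; adjust if `OLLAMA_HOST` points elsewhere) could look like:

```rust
use std::net::{SocketAddr, TcpStream};
use std::time::Duration;

fn main() {
    // 11434 is Ollama's default port.
    let addr: SocketAddr = "127.0.0.1:11434".parse().expect("valid address");
    match TcpStream::connect_timeout(&addr, Duration::from_secs(2)) {
        Ok(_) => println!("Ollama is listening on {addr}"),
        Err(e) => eprintln!("could not reach Ollama at {addr}: {e}"),
    }
}
```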
Install from crates.io:

```sh
cargo install llama-desktop
```

Or straight from the Git repository:

```sh
cargo install --git https://github.com/cacilhas/llama-desktop.git
```
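For a sense of what "sending queries" to Ollama involves, here is a hedged sketch of a single non-streaming request against Ollama's `/api/generate` endpoint. It uses the `reqwest` and `serde_json` crates and is not Llama Desktop's internal code; the model name is just an example.

```rust
// Assumed dependencies in Cargo.toml:
//   reqwest = { version = "0.12", features = ["blocking", "json"] }
//   serde_json = "1"
use std::env;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Same convention the README describes: OLLAMA_HOST with a localhost default.
    let host = env::var("OLLAMA_HOST")
        .unwrap_or_else(|_| "http://localhost:11434".to_string());

    // One completion request; "stream": false asks for a single JSON reply.
    let body = serde_json::json!({
        "model": "mistral:latest",
        "prompt": "Why is the sky blue?",
        "stream": false,
    });

    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post(format!("{host}/api/generate"))
        .json(&body)
        .send()?
        .json()?;

    println!("{}", resp["response"]);
    Ok(())
}
```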