llama-desktop

Crates.io: llama-desktop
lib.rs: llama-desktop
version: 2.2.10
source: src
created_at: 2024-01-24 04:57:13.732521
updated_at: 2024-11-19 21:57:49.780339
description: Desktop interface for Ollama
homepage: https://crates.io/crates/llama-desktop
repository: https://github.com/cacilhas/llama-desktop
id: 1111685
size: 1,140,571
owner: Montegasppα ℭacilhας (cacilhas)


README

Llama Desktop


Desktop app to connect to Ollama and send queries.

Llama Desktop reads the Ollama service URI from the `OLLAMA_HOST` environment variable, defaulting to `http://localhost:11434`.
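For example, to point Llama Desktop at an Ollama instance on another machine (the address below is a placeholder, and this assumes the installed binary is named `llama-desktop`):

```sh
# Placeholder address: replace with your Ollama host and port
export OLLAMA_HOST=http://192.168.0.10:11434
llama-desktop
```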

Installation

Ollama

If you have an NVIDIA GPU and want to run Ollama locally:

```sh
curl -fsSL https://ollama.com/install.sh | sh
sudo systemctl enable ollama
sudo systemctl start ollama
ollama pull mistral:latest
ollama pull phind-codellama:latest
```
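Once the service is running, you can sanity-check it by querying Ollama's HTTP API; the `/api/tags` endpoint lists the models you have pulled:

```sh
# Should return a JSON list of locally available models
curl -s http://localhost:11434/api/tags
```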

Last stable release

```sh
cargo install llama-desktop
```

Development version

```sh
cargo install --git https://github.com/cacilhas/llama-desktop.git
```

License
