| Crates.io | chatti |
| lib.rs | chatti |
| version | 0.1.0 |
| created_at | 2024-10-10 07:34:30.377838+00 |
| updated_at | 2024-10-10 07:34:30.377838+00 |
| description | Terminal-based chat application that interfaces with Ollama |
| homepage | https://github.com/nubilfi/chatti |
| repository | https://github.com/nubilfi/chatti |
| max_upload_size | |
| id | 1403518 |
| size | 127,048 |
Chatti is a terminal-based chat application that interfaces with Ollama, letting you interact with Ollama's language models directly from your command line.
Before you begin, ensure you have the following installed:

- Rust and Cargo (to build and run the application)
- A running Ollama instance with at least one model pulled
Chatti uses a configuration file located at `~/.config/chatti/config.toml`. If this file doesn't exist, the application creates a default one on first run. You can edit this file to change the following settings:
api_endpoint = "http://localhost:11434/api/chat"
model = "llama3.2"
stream = true
temperature = 0.7
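Chatti itself presumably loads this file with a TOML crate; as an illustration of the format, here is a minimal std-only sketch (the `parse_config` helper is hypothetical, not part of chatti's API) that reads the `key = value` pairs above into a map:

```rust
use std::collections::HashMap;

/// Hypothetical helper: parse simple `key = value` lines like the
/// config fragment above. Illustrative only; the real application
/// would use a proper TOML parser.
fn parse_config(text: &str) -> HashMap<String, String> {
    text.lines()
        .filter_map(|line| {
            let line = line.trim();
            if line.is_empty() || line.starts_with('#') {
                return None;
            }
            let (key, value) = line.split_once('=')?;
            // Strip surrounding quotes from string values.
            let value = value.trim().trim_matches('"').to_string();
            Some((key.trim().to_string(), value))
        })
        .collect()
}

fn main() {
    let raw = r#"
api_endpoint = "http://localhost:11434/api/chat"
model = "llama3.2"
stream = true
temperature = 0.7
"#;
    let config = parse_config(raw);
    assert_eq!(config["model"], "llama3.2");
    assert_eq!(config["api_endpoint"], "http://localhost:11434/api/chat");
    println!("loaded {} settings", config.len());
}
```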
- `api_endpoint`: The URL of your Ollama API endpoint
- `model`: The Ollama model you want to use
- `stream`: Whether to use streaming responses (recommended)
- `temperature`: The temperature parameter for text generation (0.0 to 1.0)

To start the application, run:
cargo run
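Under the hood, the application sends chat requests to the configured `api_endpoint`. As a rough sketch of what such a request body looks like (field names follow Ollama's public `/api/chat` API; chatti's exact serialization may differ, and `chat_request_body` is a hypothetical helper):

```rust
/// Hypothetical helper: build the JSON body for Ollama's /api/chat
/// endpoint from the config values shown above. A real client would
/// use a JSON library and properly escape the prompt.
fn chat_request_body(model: &str, prompt: &str, stream: bool, temperature: f64) -> String {
    format!(
        r#"{{"model":"{model}","messages":[{{"role":"user","content":"{prompt}"}}],"stream":{stream},"options":{{"temperature":{temperature}}}}}"#
    )
}

fn main() {
    let body = chat_request_body("llama3.2", "Hello!", true, 0.7);
    // The body carries the model name, the user message, and the options.
    assert!(body.contains(r#""model":"llama3.2""#));
    assert!(body.contains(r#""stream":true"#));
    println!("{body}");
}
```

With `stream = true`, Ollama returns the response incrementally as a sequence of JSON objects rather than a single reply, which is what lets the chat display tokens as they arrive.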
Once the application starts, you can begin chatting with the configured model from your terminal.
To run tests:
cargo test
Contributions are welcome! Please feel free to submit a Pull Request.