| Crates.io | lazyllama |
| lib.rs | lazyllama |
| version | 0.1.0 |
| created_at | 2026-01-11 02:36:48.247973+00 |
| updated_at | 2026-01-11 02:36:48.247973+00 |
| description | A lightweight TUI client for Ollama with markdown support and smart scrolling. |
| homepage | https://github.com/Pommersche92/lazyllama |
| repository | https://github.com/Pommersche92/lazyllama |
| max_upload_size | |
| id | 2035119 |
| size | 96,102 |
LazyLlama is a lightweight, fast Terminal User Interface (TUI) client for Ollama. It is designed for running local AI models with minimal overhead and intuitive, Emacs-inspired controls directly in your terminal.
- **Autoscroll:** Automatically follows the AI output.
- **Manual scroll:** Locks the view (🔒) when you use PageUp/PageDown, so you can read previous messages undisturbed.

Session logs are saved to `~/.local/share/lazyllama/`.
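Conceptually, the scrolling behaves like a two-state machine. A minimal sketch, using hypothetical names rather than the crate's actual internals:

```rust
/// Scroll state: either follow new output, or stay pinned where
/// the user scrolled to. (Hypothetical sketch for illustration.)
enum ScrollMode {
    Auto,          // follow the end of the AI output
    Manual(usize), // locked 🔒 at a fixed line offset
}

impl ScrollMode {
    /// PageUp/PageDown pins the view at the current offset.
    fn lock(&mut self, offset: usize) {
        *self = ScrollMode::Manual(offset);
    }

    /// Enter (or C-s) re-activates autoscroll.
    fn follow(&mut self) {
        *self = ScrollMode::Auto;
    }

    /// First visible line, given total lines and viewport height.
    fn top(&self, total: usize, height: usize) -> usize {
        let bottom = total.saturating_sub(height);
        match self {
            ScrollMode::Auto => bottom,                // always show the tail
            ScrollMode::Manual(o) => (*o).min(bottom), // clamp the locked offset
        }
    }
}
```

New output only moves the viewport in `Auto` mode, which is why a manual scroll lets you read back without the stream yanking the view down.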
Install from source:

1. Clone the repository:

```bash
git clone https://github.com/Pommersche92/lazyllama.git
cd lazyllama
```
2. Install it system-wide:
```bash
cargo install --path .
```
| Key | Action |
|---|---|
| Enter | Send message / Re-activate Autoscroll |
| C-q | Quit application safely |
| C-c | Clear chat history |
| C-s | Manually toggle Autoscroll |
| ↑ / ↓ | Select AI Model |
| PgUp / PgDn | Scroll history (activates Manual Mode) |
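Internally, bindings like these are typically dispatched from a single key handler. Here is a sketch of such a dispatcher, assuming crossterm events and hypothetical action names (illustrative only, not the crate's actual code):

```rust
use crossterm::event::{KeyCode, KeyEvent, KeyModifiers};

/// Actions the UI can request; names are hypothetical.
enum Action {
    Send,
    Quit,
    ClearHistory,
    ToggleAutoscroll,
    SelectModel(KeyCode),
    Scroll(KeyCode),
    None,
}

/// Map a key event to an action, mirroring the table above.
fn map_key(key: KeyEvent) -> Action {
    let ctrl = key.modifiers.contains(KeyModifiers::CONTROL);
    match (key.code, ctrl) {
        (KeyCode::Enter, _) => Action::Send,
        (KeyCode::Char('q'), true) => Action::Quit,
        (KeyCode::Char('c'), true) => Action::ClearHistory,
        (KeyCode::Char('s'), true) => Action::ToggleAutoscroll,
        (KeyCode::Up, _) | (KeyCode::Down, _) => Action::SelectModel(key.code),
        (KeyCode::PageUp, _) | (KeyCode::PageDown, _) => Action::Scroll(key.code),
        _ => Action::None,
    }
}
```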
The project follows a modular design for easy maintainability:
- `main.rs`: Entry point and terminal event handling.
- `app.rs`: State management and Ollama API integration (sketched below).
- `ui.rs`: Rendering logic and Markdown parsing.
- `utils.rs`: File system operations and session logging.

You can generate the full technical documentation locally:

```bash
cargo doc --no-deps --open
```
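To give a feel for how the modules fit together, here is a rough sketch of the kind of state `app.rs` manages. All field and type names are hypothetical; consult the generated docs for the real API:

```rust
/// Illustrative application state (hypothetical names, not the
/// crate's actual struct; see the `cargo doc` output for the real API).
struct App {
    messages: Vec<ChatMessage>, // chat history, cleared by C-c
    models: Vec<String>,        // model names reported by Ollama
    selected: usize,            // index of the active model (↑ / ↓)
    autoscroll: bool,           // follow output vs. manual lock
    input: String,              // the prompt being typed
}

/// One entry in the chat log.
struct ChatMessage {
    role: Role,      // who said it
    content: String, // raw markdown, rendered by ui.rs
}

enum Role {
    User,
    Assistant,
}
```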
This project is licensed under the GPL-2.0-or-later. See the LICENSE file for details.
Developed with ❤️ in the Black Forest.