ratatalk

Crates.io: ratatalk
lib.rs: ratatalk
version: 0.1.2
created_at: 2025-12-27 11:05:35.446257+00
updated_at: 2026-01-07 21:24:09.608428+00
description: A terminal chat client for Ollama, built with Rust and ratatui
repository: https://github.com/mohammad-albarham/ratatalk
id: 2007047
size: 660,954
owner: MOHAMMAD ALBARHAM (mohammad-albarham)

README

Ratatalk 🦀💬

A terminal chat client for Ollama, built with Rust and ratatui.

Ratatalk Screenshot

Features

  • 🚀 Fast & Responsive: Non-blocking UI with streamed responses
  • 💬 Multi-Session: Manage multiple named chat conversations
  • 💾 Persistent: Sessions auto-save across restarts
  • ⌨️ Keyboard-Driven: Vim-inspired keybindings
  • 🎨 Beautiful TUI: Clean, modern terminal interface
  • ⚙️ Configurable: TOML-based configuration

Installation

Prerequisites

You need a Rust toolchain with cargo (to install or build Ratatalk) and a local Ollama installation to serve the models you chat with.
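
A quick way to check both from a terminal (version numbers will vary; assumes the Ollama CLI is on your PATH):

cargo --version
ollama --version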

Install via crates.io (recommended)

The easiest way to get started is to install Ratatalk as a global CLI tool:

cargo install ratatalk

This downloads the latest release from crates.io and installs the binary into ~/.cargo/bin. As long as that directory is on your PATH (the rustup installer normally adds it), you can run ratatalk from any directory:

ratatalk
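
If the ratatalk command is not found, cargo's install directory is probably missing from your PATH. You can add it for the current shell (put the same line in your shell profile to make it permanent; assumes bash or zsh):

export PATH="$HOME/.cargo/bin:$PATH"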

Screenshot from the app:

Ratatalk in action

Run from source (for development)

If you want to hack on Ratatalk or try the latest features from the repository:

git clone https://github.com/mohammad-albarham/ratatalk.git
cd ratatalk

Then run it in development mode:

cargo run

Or with optimizations (recommended for better performance):

cargo run --release

Optional: Handy shell alias

If you frequently run from source, you can add a shell alias to ~/.bashrc or ~/.zshrc. Note that the alias invokes cargo in the current directory, so use it from inside the cloned repository:

alias rt='cargo run --release'

Reload your shell configuration to apply it:

source ~/.zshrc  # or source ~/.bashrc

Now you can simply type rt to run Ratatalk:

rt

Usage

Running Ollama

Ratatalk expects a local Ollama server running on http://127.0.0.1:11434 (the default). Start Ollama in a terminal:

ollama serve
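
To confirm the server is reachable, and to see which models are installed, you can query Ollama's HTTP API. If you have no models yet, pull one first (llama3.2 matches the default_model in the Configuration section below):

# Lists locally installed models as JSON
curl http://127.0.0.1:11434/api/tags

# Download a model to chat with, if needed
ollama pull llama3.2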

Starting Ratatalk

In another terminal, run Ratatalk:

ratatalk

If you installed via crates.io, just use ratatalk. If you're running from source with the shell alias, use rt.

Keybindings

General

Key          Action
q / Ctrl+c   Quit
?            Toggle help
Ctrl+r       Refresh models

Navigation

Key          Action
Tab          Next session
Shift+Tab    Previous session
Ctrl+n       New session
Ctrl+w       Delete session
m            Select model

Chat

Key          Action
i / Enter    Start typing
Esc          Stop typing
Enter        Send message (while typing)

Scrolling

Key          Action
j / ↓        Scroll down
k / ↑        Scroll up
Ctrl+d       Page down
Ctrl+u       Page up
g            Scroll to top
G            Scroll to bottom

Input Editing

Key          Action
Ctrl+a       Move to start of line
Ctrl+e       Move to end of line
Ctrl+u       Clear input
Ctrl+w       Delete word

Configuration

Configuration is stored at ~/.config/ratatalk/config.toml:

[server]
host = "http://127.0.0.1:11434"
timeout_secs = 30

[model]
default_model = "llama3.2:latest"
temperature = 0.7
top_k = 40
top_p = 0.9
max_tokens = 0  # 0 = unlimited

[ui]
show_timestamps = true
show_token_count = true
sidebar_width = 30
mouse_support = true
tick_rate_ms = 100

[keybindings]
vim_mode = false
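
To change any of these settings, edit the file and restart Ratatalk afterwards so the new values are picked up. If the file or directory does not exist yet, create it first, for example:

mkdir -p ~/.config/ratatalk
$EDITOR ~/.config/ratatalk/config.toml   # use nano, vim, or any editor you like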

Data Storage

  • Config: ~/.config/ratatalk/config.toml
  • Sessions: ~/.local/share/ratatalk/sessions.json
  • Logs: ~/.config/ratatalk/ratatalk.log
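
Because sessions are kept in a single JSON file, backing up or moving your chat history is just a file copy (the destination name below is only an example):

cp ~/.local/share/ratatalk/sessions.json ~/ratatalk-sessions-backup.json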

Architecture

src/
├── main.rs           # Entry point, terminal setup, main loop
├── app.rs            # Application state, events, actions
├── config.rs         # Configuration management
├── error.rs          # Error types
├── events.rs         # Input handling, keybindings
├── persistence.rs    # Session save/load
├── ollama/
│   ├── mod.rs        # Module exports
│   ├── client.rs     # HTTP client
│   └── types.rs      # API types
└── ui/
    ├── mod.rs        # UI module, colors, styles
    ├── layout.rs     # Screen layout
    ├── chat.rs       # Chat area rendering
    ├── input.rs      # Input box rendering
    ├── sidebar.rs    # Session/model sidebar
    └── popup.rs      # Modal dialogs

Roadmap

MVP ✅

  • Connect to local Ollama
  • List available models
  • Chat with streaming responses
  • Multiple chat sessions
  • Persistent config & sessions
  • Basic keybindings

Future

  • System prompts per session
  • Adjustable model parameters per session
  • SQLite backend for history
  • Export chat to Markdown
  • Token/latency statistics
  • Vim-style keybindings
  • Search within chat
  • RAG support with local files
  • Image/multimodal support

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
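
Before opening a PR, it is a good idea to run the usual Rust checks locally (a suggested workflow, not a stated project requirement):

cargo fmt
cargo clippy
cargo test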

License

MIT License - see LICENSE for details.

Acknowledgments

  • Ollama for the excellent local LLM server
  • ratatui for the amazing TUI framework
  • oterm for inspiration