| Crates.io | lumen |
| lib.rs | lumen |
| version | 2.19.0 |
| created_at | 2024-10-29 19:33:19.819092+00 |
| updated_at | 2026-01-22 20:07:31.866211+00 |
| description | lumen is a command-line tool that uses AI to generate commit messages, summarise git diffs or past commits, and more. |
| homepage | |
| repository | https://github.com/jnsahaj/lumen |
| max_upload_size | |
| id | 1427541 |
| size | 746,250 |
The missing code review tool in the era of AI coding agents.
Before you begin, ensure you have:
- git installed on your system
- fzf (optional): required for the lumen explain --list command

To install with Homebrew:

brew install jnsahaj/lumen/lumen
[!IMPORTANT]
cargo is a package manager for rust, and is installed automatically when you install rust. See the installation guide.
cargo install lumen
If you want to use AI-powered features (explain, draft, list, operate), run the interactive setup:
lumen configure
This will guide you through provider, model, and API key setup. The configuration is saved to ~/.config/lumen/lumen.config.json.
[!NOTE] The diff command works without any configuration - it's a standalone visual diff viewer.
Create meaningful commit messages for your staged changes:
# Basic usage - generates a commit message based on staged changes
lumen draft
# Output: "feat(button.tsx): Update button color to blue"
# Add context for more meaningful messages
lumen draft --context "match brand guidelines"
# Output: "feat(button.tsx): Update button color to align with brand identity guidelines"
Ask Lumen to generate Git commands based on a natural language query:
lumen operate "squash the last 3 commits into 1 with the message 'squashed commit'"
# Output: git reset --soft HEAD~3 && git commit -m "squashed commit" [y/N]
The command will display an explanation of what the generated command does, show any warnings for potentially dangerous operations, and prompt for confirmation before execution.
Launch an interactive side-by-side diff viewer in your terminal:
# View uncommitted changes
lumen diff
# View changes for a specific commit
lumen diff HEAD~1
# View changes between branches
lumen diff main..feature/A
# View changes in a GitHub Pull Request
lumen diff --pr 123 # (--pr is optional)
lumen diff https://github.com/owner/repo/pull/123
# Filter to specific files
lumen diff --file src/main.rs --file src/lib.rs
# Watch mode - auto-refresh on file changes
lumen diff --watch
# Stacked mode - review commits one by one
lumen diff main..feature --stacked
Review a range of commits one at a time with --stacked:
lumen diff main..feature --stacked
lumen diff HEAD~5..HEAD --stacked
This displays each commit individually, letting you navigate through them:
ctrl+h / ctrl+l: Previous / next commit
‹ / › arrows in the header

The header shows the current commit position, SHA, and message. Viewed files are tracked per commit, so your progress is preserved when navigating.
When viewing a PR, you can mark files as viewed (syncs with GitHub) using the space keybinding.
Customize the diff viewer colors with preset themes:
# Using CLI flag
lumen diff --theme dracula
# Using environment variable
LUMEN_THEME=catppuccin-mocha lumen diff
# Or set permanently in config file (~/.config/lumen/lumen.config.json)
{
"theme": "dracula"
}
Available themes:
| Theme | Value |
|---|---|
| Default (auto-detect) | dark, light |
| Catppuccin | catppuccin-mocha, catppuccin-latte |
| Dracula | dracula |
| Nord | nord |
| One Dark | one-dark |
| Gruvbox | gruvbox-dark, gruvbox-light |
| Solarized | solarized-dark, solarized-light |
Priority: CLI flag > config file > LUMEN_THEME env var > OS auto-detect.
Add comments to hunks during code review:
i: Add/edit annotation on focused hunk
I: View all annotations (edit, delete, copy, or export)

Annotations can be copied to clipboard or exported to a file for sharing.
j/k or arrow keys: Navigate
{/}: Jump between hunks
tab: Toggle sidebar
space: Mark file as viewed
e: Open file in editor
i/I: Add annotation / view all annotations
ctrl+h/l: Previous/next commit (stacked mode)
?: Show all keybindings

Understand what changed and why:
# Explain current changes in your working directory
lumen explain # All changes
lumen explain --staged # Only staged changes
# Explain specific commits
lumen explain HEAD # Latest commit
lumen explain abc123f # Specific commit
lumen explain HEAD~3..HEAD # Last 3 commits
lumen explain main..feature/A # Branch comparison
lumen explain main...feature/A # Branch comparison (merge base)
# Ask specific questions about changes
lumen explain --query "What's the performance impact of these changes?"
lumen explain HEAD --query "What are the potential side effects?"
# Interactive commit selection via fuzzy finder (requires: fzf)
lumen explain --list
# Deprecated: lumen list (use lumen explain --list instead)
# Copy commit message to clipboard
lumen draft | pbcopy # macOS
lumen draft | xclip -selection c # Linux
# View the commit message and copy it
lumen draft | tee >(pbcopy)
# Open in your favorite editor
lumen draft | code -
# Directly commit using the generated message
lumen draft | git commit -F -
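If you commit this way often, a plain git alias can wrap the pipeline (the alias name ai-commit below is just an example, not something lumen defines):

```shell
# Register an alias (hypothetical name "ai-commit") that commits
# staged changes with a message generated by lumen
git config --global alias.ai-commit '!lumen draft | git commit -F -'

# Then, after staging changes:
git ai-commit
```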
If you are using lazygit, you can add this to its user config:
customCommands:
- key: '<c-l>'
context: 'files'
command: 'lumen draft | tee >(pbcopy)'
loadingText: 'Generating message...'
showOutput: true
- key: '<c-k>'
context: 'files'
command: 'lumen draft -c {{.Form.Context | quote}} | tee >(pbcopy)'
loadingText: 'Generating message...'
showOutput: true
prompts:
- type: 'input'
title: 'Context'
key: 'Context'
Configure your preferred AI provider:
# Using CLI arguments
lumen -p openai -k "your-api-key" -m "gpt-5-mini" draft
# Using environment variables
export LUMEN_AI_PROVIDER="openai"
export LUMEN_API_KEY="your-api-key"
export LUMEN_AI_MODEL="gpt-5-mini"
| Provider | API Key Required | Models |
|---|---|---|
| OpenAI openai (Default) | Yes | gpt-5.2, gpt-5, gpt-5-mini, gpt-5-nano, gpt-4.1, gpt-4.1-mini, o4-mini (default: gpt-5-mini) |
| Claude claude | Yes | claude-sonnet-4-5-20250930, claude-opus-4-5-20251115, claude-haiku-4-5-20251015 (default: claude-sonnet-4-5-20250930) |
| Gemini gemini | Yes (free tier) | gemini-3-pro, gemini-3-flash-preview, gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite (default: gemini-2.5-flash) |
| Groq groq | Yes (free) | llama-3.3-70b-versatile, llama-3.1-8b-instant, meta-llama/llama-4-maverick-17b-128e-instruct, openai/gpt-oss-120b (default: llama-3.3-70b-versatile) |
| DeepSeek deepseek | Yes | deepseek-chat (V3.2), deepseek-reasoner (default: deepseek-chat) |
| xAI xai | Yes | grok-4, grok-4-mini, grok-4-mini-fast (default: grok-4-mini-fast) |
| OpenCode Zen opencode-zen | Yes | see list (default: claude-sonnet-4-5) |
| Ollama ollama | No (local) | see list (default: llama3.2) |
| OpenRouter openrouter | Yes | see list (default: anthropic/claude-sonnet-4.5) |
| Vercel AI Gateway vercel | Yes | see list (default: anthropic/claude-sonnet-4.5) |
Lumen supports configuration through a JSON file. You can place the configuration file in one of the following locations:

~/.config/lumen/lumen.config.json (macOS/Linux)
%USERPROFILE%\.config\lumen\lumen.config.json (Windows)

Lumen will load configurations in the following order of priority:
{
"provider": "openai",
"model": "gpt-5-mini",
"api_key": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"theme": "catppuccin-mocha",
"draft": {
"commit_types": {
"docs": "Documentation only changes",
"style": "Changes that do not affect the meaning of the code",
"refactor": "A code change that neither fixes a bug nor adds a feature",
"perf": "A code change that improves performance",
"test": "Adding missing tests or correcting existing tests",
"build": "Changes that affect the build system or external dependencies",
"ci": "Changes to our CI configuration files and scripts",
"chore": "Other changes that don't modify src or test files",
"revert": "Reverts a previous commit",
"feat": "A new feature",
"fix": "A bug fix"
}
}
}
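One way to bootstrap the file from a shell, using the macOS/Linux path above (the provider and model values are just examples drawn from the provider table in this README):

```shell
# Create the config directory and write a minimal lumen.config.json
mkdir -p ~/.config/lumen
cat > ~/.config/lumen/lumen.config.json <<'EOF'
{
  "provider": "ollama",
  "model": "llama3.2"
}
EOF
```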
Options are applied in the following order (highest to lowest priority): CLI flags, then the configuration file, then environment variables.
Example: Using different providers for different projects:
# Set global defaults in .zshrc/.bashrc
export LUMEN_AI_PROVIDER="openai"
export LUMEN_AI_MODEL="gpt-5-mini"
export LUMEN_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"
# Override per project using config file
{
"provider": "ollama",
"model": "llama3.2"
}
# Or override using CLI flags
lumen -p "ollama" -m "llama3.2" draft
Made with contrib.rocks.
Contributions are welcome! Please feel free to submit a Pull Request.