| Field | Value |
|---|---|
| Crates.io | bashelp |
| lib.rs | bashelp |
| version | 0.1.0 |
| created_at | 2026-01-05 20:29:17.656522+00 |
| updated_at | 2026-01-05 20:29:17.656522+00 |
| description | Natural language to shell commands. Local-first, provider agnostic. |
| homepage | |
| repository | https://github.com/sqrew/bashelp |
| max_upload_size | |
| id | 2024524 |
| size | 100,969 |
Natural language to shell commands. Local-first, provider agnostic.
```sh
$ bashelp find all rust files modified this week
→ find . -name "*.rs" -mtime -7
[Enter to run, 'c' to copy, 'e' to edit, 'q' to quit]:
```
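The suggestion in the demo above is ordinary `find`; as a sanity check of what that generated command actually matches, here is a quick sketch in a throwaway directory (assumes GNU `touch -d` for backdating file times):

```shell
tmp=$(mktemp -d) && cd "$tmp"
touch new.rs notes.txt               # modified just now
touch -d '30 days ago' old.rs        # backdated outside the 7-day window
find . -name "*.rs" -mtime -7        # matches only ./new.rs
```

`-mtime -7` keeps files modified within the last 7 days, and `-name "*.rs"` filters out `notes.txt`.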
Most AI shell assistants require cloud API keys. bashelp is local-first — it works with ollama out of the box, keeping your data on your machine.
```sh
cargo install bashelp
```
Or build from source:
```sh
git clone https://github.com/sqrew/bashelp
cd bashelp
cargo build --release
```
Install ollama (if you haven't): https://ollama.ai

Pull a model:

```sh
ollama pull llama3
```

Set your default model:

```sh
bashelp use llama3
```

Ask for help:

```sh
bashelp find files larger than 100mb
```
```
bashelp <query>        Ask for a shell command (no quotes needed!)
bashelp use <model>    Set default model
bashelp config init    Create config file
bashelp config show    Show current config
bashelp --help         Show all options
```
| Flag | Description |
|---|---|
| `-y, --yes` | Skip confirmation, run immediately |
| `-e, --explain` | Explain a command instead of generating one |
| `-m, --model` | Override model for this query |
| `-p, --provider` | Override provider for this query |
| `-v, --verbose` | Show debug info |
| `--dry-run` | Show command but don't execute |
```sh
# Generate a command (no quotes needed!)
bashelp compress this folder

# Run without confirmation
bashelp -y update system packages

# Explain a command you don't understand
bashelp --explain "tar -xzvf"

# Use a specific model for one query
bashelp -m mistral disk usage by folder

# Use a different provider
bashelp -p groq -m llama-3.3-70b-versatile list docker containers
```
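As a companion to the `--explain` example: `tar -xzvf` combines extract, gzip, verbose, and file. A minimal round-trip showing those flags in action (a sketch, run in a temp directory):

```shell
tmp=$(mktemp -d) && cd "$tmp"
mkdir demo && echo hello > demo/file.txt
tar -czf demo.tar.gz demo            # c = create, z = gzip, f = archive file
rm -r demo
tar -xzvf demo.tar.gz                # x = extract, z = gzip, v = verbose, f = file
cat demo/file.txt                    # prints: hello
```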
| Provider | Aliases | Default Endpoint |
|---|---|---|
| ollama | - | http://localhost:11434 |

| Provider | Aliases | Models |
|---|---|---|
| claude | anthropic | claude-3-5-haiku-20241022, claude-3-5-sonnet-20241022, etc. |
| openai | chatgpt, gpt | gpt-4o, gpt-4o-mini, etc. |
| gemini | google | gemini-1.5-flash, gemini-1.5-pro, etc. |
| grok | xai | grok-2, etc. |
| groq | - | llama-3.3-70b-versatile, mixtral-8x7b-32768, etc. |
| mistral | - | mistral-large-latest, mistral-small-latest, etc. |
| perplexity | pplx | llama-3.1-sonar-small-128k-online, etc. |
| together | - | meta-llama/Llama-3-70b-chat-hf, etc. |
| fireworks | - | accounts/fireworks/models/llama-v3-70b-instruct, etc. |
| deepseek | - | deepseek-chat, deepseek-coder, etc. |
| openrouter | - | Any model available on OpenRouter |
| openai-compatible | custom | Any OpenAI-compatible API (bring your own endpoint) |
Config lives at `~/.config/bashelp/config.toml`:

```toml
[provider]
name = "ollama"
model = "llama3"
endpoint = "http://localhost:11434"
# api_key = "your-key-here" # for cloud providers

[behavior]
confirm = true
dangerous_warn = true
```
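For a cloud provider the same file shape applies. A hypothetical Groq setup might look like this (a sketch using the keys shown above — the API key is a placeholder, and it assumes the endpoint may be omitted for built-in providers):

```toml
[provider]
name = "groq"
model = "llama-3.3-70b-versatile"
api_key = "gsk_your-key-here"   # placeholder, not a real key

[behavior]
confirm = true          # ask before running generated commands
dangerous_warn = true   # keep warnings for risky-looking commands
```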
```sh
# Claude
bashelp config set provider.name claude
bashelp config set provider.api_key sk-ant-...
bashelp use claude-3-5-haiku-20241022

# OpenAI
bashelp config set provider.name openai
bashelp config set provider.api_key sk-...
bashelp use gpt-4o-mini

# Groq (fast & free tier!)
bashelp config set provider.name groq
bashelp config set provider.api_key gsk_...
bashelp use llama-3.3-70b-versatile

# Gemini
bashelp config set provider.name gemini
bashelp config set provider.api_key ...
bashelp use gemini-1.5-flash

# Custom OpenAI-compatible endpoint
bashelp config set provider.name openai-compatible
bashelp config set provider.endpoint https://your-api.com/v1/chat/completions
bashelp config set provider.api_key your-key
bashelp use your-model
```
MIT
PRs welcome! This project is built with love and Rust.
Made by sqrew with help from Claude. 🦀