Crates.io | shlf |
lib.rs | shlf |
source | src |
created_at | 2024-09-17 22:51:18.582343 |
updated_at | 2024-12-09 06:10:16.913753 |
description | AI-based command-line tools for developers |
repository | https://github.com/ab22593k/shelf |
id | 1378417 |
Shelf is a command-line tool for bookmarking configuration files across your system, generating git commit messages, and reviewing code using AI.
It provides a simple interface to track files anywhere on your system and integrates with multiple AI providers to generate meaningful commit messages through git hooks and to perform comprehensive code reviews. Shelf makes configuration file management, git commit messages, and code reviews effortless.
To install Shelf, you need to have Rust and Cargo installed on your system. If you don't have them, you can install them from rustup.rs.
Once you have Rust and Cargo installed, you can build and install Shelf using the following command:
cargo install --path .
Shelf provides commands for both dotfile management and git integration:
# Add a new dotfile to track
shelf bo track ~/.bashrc
# List all tracked dotfiles
shelf bo list
# Remove a dotfile from tracking
shelf bo untrack ~/.bashrc
# Interactive selection of dotfiles to track
shelf bo suggest -i
# Show help
shelf --help
Each command can be run with -h or --help for more information.
The ai subcommand provides AI-powered features:
# Configure AI provider
shelf ai config set provider openai
shelf ai config set openai_api_key "your-api-key"
# Use specific provider for one commit
shelf ai commit -p openai
# List current configuration
shelf ai config list
The AI-powered features support different AI providers: OpenAI, Anthropic, Gemini, Groq, xAI, and Ollama.
The git hook integrates seamlessly with your normal git workflow:
# Hook will automatically generate message if none provided
git commit
# Your message takes precedence
git commit -m "feat: your message"
# AI helps with amending
git commit --amend
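The precedence rules above can be sketched as a prepare-commit-msg hook. This is a hypothetical sketch, not the hook Shelf actually installs: the exact shelf invocation, and the assumption that shelf ai commit writes the generated message to stdout, are guesses.

```shell
#!/bin/sh
# Hypothetical .git/hooks/prepare-commit-msg sketch (not Shelf's real hook).
# $1 = path to the commit message file; $2 = message source
# ("message" when -m was used, empty for a plain `git commit`).
MSG_FILE="$1"
SOURCE="$2"

# Only generate a message when the user supplied none of their own,
# and only if shelf is actually on PATH.
if [ -z "$SOURCE" ] && command -v shelf >/dev/null 2>&1; then
    shelf ai commit > "$MSG_FILE"
fi
```

With a guard like this, git commit -m "feat: your message" leaves your message untouched, matching the precedence described above.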
Shelf can assist in code review by analyzing pull requests and providing AI-powered feedback:
# Review the currently staged changes
shelf ai review
The AI review provides structured feedback on your staged changes.
If you're upgrading from v0.8.7, here are the key changes and migration steps:
# Migration hints
shelf migrate
# Apply changes
shelf migrate --fix
Prompt templates for commit messages and code reviews are stored in $XDG_CONFIG_HOME/shelf (or ~/.config/shelf if $XDG_CONFIG_HOME is not set).
You can customize these templates to tailor the AI's output to your specific needs.
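The lookup above follows the standard XDG fallback rule; a quick way to see where the templates live on your machine:

```shell
# Resolve the shelf template directory: use $XDG_CONFIG_HOME if set,
# otherwise fall back to ~/.config.
TEMPLATE_DIR="${XDG_CONFIG_HOME:-$HOME/.config}/shelf"
echo "$TEMPLATE_DIR"
```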
Shelf supports generating shell completion scripts for various shells. You can generate these scripts using the completion subcommand:
# Generate completion script for Bash
shelf completion bash > shelf.bash
# Generate completion script for Zsh
shelf completion zsh > _shelf
# Generate completion script for Fish
shelf completion fish > shelf.fish
To use the completion scripts:
For Bash, add the following line to your ~/.bashrc:
source /path/to/shelf.bash
For Zsh, place the _shelf file in ~/.zfunc, then add source ~/.zfunc/_shelf to your ~/.zshrc.
For Fish, place the shelf.fish file in ~/.config/fish/completions.
After setting up the completion script, restart your shell or source the respective configuration file to enable completions for the shelf command.
AI settings are stored in ~/.config/shelf/ai.json (or $XDG_CONFIG_HOME/shelf/ai.json if set). You can configure:
provider: AI provider to use (openai, anthropic, gemini, groq, xai, or ollama)
model: Ollama model to use (default: qwen2.5-coder)
openai_api_key: OpenAI API key for GPT models
ollama_host: Ollama server URL (default: http://localhost:11434)
Example configuration:
{
  "provider": "ollama",
  "model": "qwen2.5-coder",
  "ollama_host": "http://localhost:11434"
}
Note that JSON does not allow comments; the ollama_host key is only needed when your Ollama server runs on a non-default host.
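As a sketch, a minimal Ollama-only configuration could be written from the shell (the path follows the XDG rule above; whether omitted keys fall back to the listed defaults is an assumption):

```shell
# Write a minimal ai.json into the documented config location.
CONF_DIR="${XDG_CONFIG_HOME:-$HOME/.config}/shelf"
mkdir -p "$CONF_DIR"
cat > "$CONF_DIR/ai.json" <<'EOF'
{
  "provider": "ollama",
  "model": "qwen2.5-coder"
}
EOF
echo "wrote $CONF_DIR/ai.json"
```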
To build the project locally:
cargo build
To run tests:
cargo test
To run the project directly without installing:
cargo run --bin shelf -- [SUBCOMMAND]
Replace [SUBCOMMAND] with the command you want to run, such as bo or ai.
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.