cargo-ox

Crates.io: cargo-ox
lib.rs: cargo-ox
version: 0.1.0
created_at: 2025-11-29 15:38:53.532895+00
updated_at: 2025-11-29 15:38:53.532895+00
description: Cargo Oxide is a CLI tool for AI-powered Rust development.
homepage: https://github.com/vladneyo/cargo-ox
repository: https://github.com/vladneyo/cargo-ox
documentation:
max_upload_size:
id: 1956853
size: 58,385
owner: Vladyslav Neichev (vladneyo)

README

cargo-ox 🐂

cargo-ox is a Cargo subcommand that uses a local LLM (via Ollama) to help you develop Rust code. It can explain compilation errors and suggest refactors.

Prerequisites

  • Rust: Ensure you have Rust and Cargo installed.
  • Ollama: You need Ollama installed and running locally.
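
If you want to verify both prerequisites from the shell, the standard version checks are:

rustc --version
cargo --version
ollama --version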

Setup

We provide a setup script to automatically install Ollama (if missing), start the server, and pull the required model.

  1. Run the setup script:
    ./setup_ollama.sh
    
    This will prepare your environment and ensure the default model (gpt-oss:20b) is available.
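
If you prefer to set things up by hand, the script's steps correspond roughly to the standard Ollama workflow. This is a sketch, not a substitute for setup_ollama.sh; the install one-liner is for Linux, while macOS users typically install the Ollama app or use Homebrew:

curl -fsSL https://ollama.com/install.sh | sh   # install Ollama if missing
ollama serve &                                  # start the local Ollama server
ollama pull gpt-oss:20b                         # fetch the default model
curl http://localhost:11434/api/tags            # sanity check: list local models (default port)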

Installation

To install cargo-ox locally:

cargo install --path .
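
Since the crate is published on crates.io, installing from the registry should also work (assuming the published package ships the binary):

cargo install cargo-ox

Either way, Cargo picks up the cargo-ox binary on your PATH and exposes it as the cargo ox subcommand.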

Usage

Explain Compilation Errors

Run cargo ox explain to invoke cargo check, capture any errors, and get an AI explanation with suggested fixes.

cargo ox explain

You can also run it on a specific project directory:

cargo ox explain --project ../my-other-project

Refactor Code

Run cargo ox refactor to get AI suggestions for improving a specific file.

cargo ox refactor --file src/main.rs

Configuration

By default, cargo-ox uses the gpt-oss:20b model. You can override this by setting the OX_MODEL environment variable:

export OX_MODEL=llama3
cargo ox explain
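
Standard shell syntax also lets you set the variable for a single invocation:

OX_MODEL=llama3 cargo ox explain

Whichever model you choose must already be available in your local Ollama instance (pull it first with ollama pull <model>).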