| Field | Value |
|-------|-------|
| Crates.io | ai-context-gen |
| lib.rs | ai-context-gen |
| version | 0.1.2 |
| created_at | 2025-07-10 02:19:35 UTC |
| updated_at | 2025-07-10 14:22:05 UTC |
| description | A context generator for Rust repositories that creates structured markdown files with relevant information for LLMs and AI agents |
| homepage | https://github.com/brbtavares/ai-context-gen |
| repository | https://github.com/brbtavares/ai-context-gen |
| id | 1745744 |
| size | 128,451 |
[crates.io](https://crates.io/crates/ai-context-gen) · [docs.rs](https://docs.rs/ai-context-gen) · [CI](https://github.com/brbtavares/ai-context-gen/actions) · [MIT License](LICENSE-MIT) · [Rust](https://www.rust-lang.org)
A context generator for Rust repositories that creates a structured markdown file with relevant information for LLMs and AI agents.
## Quick Start
**Choose your preferred way to use AI Context Generator:**
| Usage Mode | When to Use | Quick Start |
|------------|-------------|-------------|
| **CLI Tool** | Interactive use, one-time analysis, scripts | `ai-context-gen --path ./my-project` |
| **Rust Library** | Integrate into Rust apps, custom workflows | `cargo add ai-context-gen` |
---
### CLI Quick Start
```bash
# Install globally
git clone https://github.com/brbtavares/ai-context-gen
cd ai-context-gen && make install

# Use anywhere
ai-context-gen --path /path/to/project
```

### Library Quick Start

```toml
# Cargo.toml
[dependencies]
ai-context-gen = "0.1.2"
```

```rust
use ai_context_gen::generate_context;
use std::path::PathBuf;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    generate_context(PathBuf::from("."), "context.md".to_string()).await?;
    Ok(())
}
```
The tool analyzes `.rs` and `.md` files in the repository.

## CLI Tool

The AI Context Generator CLI is perfect for interactive use, one-time analysis, and shell scripts.
```bash
# Clone the repository
git clone https://github.com/brbtavares/ai-context-gen
cd ai-context-gen

# Build and install globally (recommended)
make install

# Alternative: step by step
make build
sudo cp target/release/ai-context-gen /usr/local/bin/

# Check if installed correctly
ai-context-gen --version
ai-context-gen --help

# Should work from any directory
cd /tmp && ai-context-gen --path ~/my-project
```
```bash
# Development & Testing
make dev        # Build and run in development mode
make demo       # Run demo with current directory
make test       # Run tests
make check      # Run format, lint and tests

# Build & Installation
make build      # Build using script (recommended)
make install    # Install on system
make uninstall  # Remove from system

# Utilities
make clean      # Clean build artifacts
make help-make  # Show all make commands
```
```bash
# Analyze current directory (interactive mode)
ai-context-gen

# Analyze specific directory
ai-context-gen --path /path/to/project

# Custom output file
ai-context-gen --output my_context.md

# High token limit for large projects
ai-context-gen --max-tokens 100000
```
```text
ai-context-gen [OPTIONS]

Options:
  -p, --path <PATH>              Path to repository (default: current directory)
  -m, --max-tokens <MAX_TOKENS>  Maximum number of tokens (default: 50000)
  -o, --output <OUTPUT>          Output file name (default: repo_context.md)
      --include-hidden           Include hidden files and directories
      --include-deps             Include external dependencies analysis
  -h, --help                     Print help
  -V, --version                  Print version
```
```bash
# Complete analysis with all options
ai-context-gen --path ~/my-rust-project --max-tokens 200000 --output complete_analysis.md --include-hidden

# Quick summary
ai-context-gen --max-tokens 10000 --output summary.md

# Analyze remote/different project
ai-context-gen --path /opt/some-project --output /tmp/analysis.md
```
## Rust Library

The AI Context Generator library is perfect for integrating context generation into your Rust applications.
Add to your `Cargo.toml`:

```toml
[dependencies]
ai-context-gen = "0.1.2"
```
```rust
use ai_context_gen::generate_context;
use std::path::PathBuf;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Generate context for current directory
    generate_context(PathBuf::from("."), "context.md".to_string()).await?;
    println!("Context generated in context.md");
    Ok(())
}
```
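These async examples also assume `tokio` (with the `macros` and `rt-multi-thread` features, or `full`) and `anyhow` are listed as dependencies in your `Cargo.toml`.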
For more control, build a custom `Config` and run the scanner and generator as separate steps:

```rust
use ai_context_gen::{Config, ContextGenerator, RepositoryScanner};
use std::path::PathBuf;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Custom configuration
    let config = Config {
        repo_path: PathBuf::from("./my-project"),
        max_tokens: 100000,
        output_file: "detailed_context.md".to_string(),
        include_hidden: true,
        include_deps: true,
    };

    // Two-step process for more control
    let scanner = RepositoryScanner::new(config.clone());
    let scan_result = scanner.scan().await?;
    println!("Files found: {}", scan_result.files.len());

    let generator = ContextGenerator::new(config);
    generator.generate_context(scan_result).await?;
    println!("Context generated successfully!");

    Ok(())
}
```
Or run the whole pipeline with a custom configuration via `generate_context_with_config`:

```rust
use ai_context_gen::{Config, generate_context_with_config};
use std::path::PathBuf;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let config = Config {
        repo_path: PathBuf::from("/path/to/analyze"),
        max_tokens: 75000,
        output_file: "custom_context.md".to_string(),
        include_hidden: false,
        include_deps: true,
    };

    generate_context_with_config(config).await?;
    Ok(())
}
```
The public API:

- `generate_context(path, output)`: Simple function for basic cases
- `generate_context_with_config(config)`: Function with custom configuration
- `Config`: Configuration structure
- `RepositoryScanner`: File scanning and analysis
- `ContextGenerator`: Context generation with priorities
- `RustParser`: Rust code AST parser

The generated file is organized into sections, written in priority order.
The system uses an intelligent prioritization algorithm: when the token limit is reached, higher-priority sections are kept and lower-priority content is trimmed to fit the budget.
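The exact rules are internal to the crate, but the general shape of priority-ordered, token-budgeted assembly looks roughly like the sketch below (hypothetical `Section` type and `count_tokens` helper, not the crate's actual API):

```rust
// Illustrative only: stand-in types, not ai-context-gen internals.
struct Section {
    title: String,
    body: String,
    priority: u8, // lower value = more important
}

// Placeholder token counter; a real implementation would use a GPT-style tokenizer.
fn count_tokens(text: &str) -> usize {
    text.split_whitespace().count()
}

// Write sections in priority order until the token budget runs out.
fn assemble(mut sections: Vec<Section>, max_tokens: usize) -> String {
    sections.sort_by_key(|s| s.priority);

    let mut output = String::new();
    let mut used = 0;
    for section in sections {
        let cost = count_tokens(&section.body);
        if used + cost > max_tokens {
            // Budget exhausted: remaining lower-priority sections are dropped.
            break;
        }
        output.push_str(&format!("## {}\n\n{}\n\n", section.title, section.body));
        used += cost;
    }
    output
}
```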
The system automatically ignores:

**Directories:** `target/`, `node_modules/`, `.git/`, `.vscode/`, `.idea/`

**Files:** `Cargo.lock`, `.gitignore`, `.DS_Store`

Token counting uses the GPT-4 tokenizer, so the reported counts match what GPT-4-class models actually see.
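If you want to sanity-check the budget yourself, you can count tokens with the same GPT-4 encoding through the `tiktoken-rs` crate (an assumption on my part; this project may use a different tokenizer crate internally):

```rust
use tiktoken_rs::cl100k_base;

fn main() {
    // cl100k_base is the encoding used by GPT-4.
    let bpe = cl100k_base().expect("failed to load cl100k_base encoding");
    let text = std::fs::read_to_string("repo_context.md")
        .expect("failed to read the generated context file");

    // Count tokens the same way a GPT-4-class model would see them.
    let tokens = bpe.encode_with_special_tokens(&text);
    println!("{} tokens in repo_context.md", tokens.len());
}
```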
Only `.rs` and `.md` files are analyzed.

## Contributing

Contributions are welcome! To get a development build running:
```bash
git clone https://github.com/brbtavares/ai-context-gen
cd ai-context-gen
cargo build
cargo test
```
For maintainers, releases are automated. See RELEASE.md for details.
```bash
# Update version and changelog, then:
git tag v0.1.2
git push origin v0.1.2

# GitHub Actions handles the rest!
```
This project is licensed under the MIT license. See the LICENSE file for details.
## Troubleshooting

If `ai-context-gen` is not found after installation, check that `/usr/local/bin` is on your `PATH`:

```bash
# Check if /usr/local/bin is in your PATH
echo $PATH | grep -o '/usr/local/bin'

# If not found, add to your shell profile
echo 'export PATH="/usr/local/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc

# Verify installation
which ai-context-gen
ai-context-gen --version
```

If installation fails with a permissions error:

```bash
# Make sure you have sudo privileges
sudo make install

# Or install manually
make build
sudo cp target/release/ai-context-gen /usr/local/bin/
sudo chmod +x /usr/local/bin/ai-context-gen
```

If an old installation is interfering, remove it and reinstall:

```bash
# Remove old installations
rm -f ~/.local/bin/ai-context-gen
sudo rm -f /usr/local/bin/ai-context-gen

# Reinstall fresh
make clean
make install
```
If the library panics about a missing Tokio runtime, make sure you're using `#[tokio::main]` or initializing a runtime yourself:

```rust
// Option 1: Use tokio::main
#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // your code here
    Ok(())
}
```

```rust
// Option 2: Manual runtime
fn main() -> anyhow::Result<()> {
    let rt = tokio::runtime::Runtime::new()?;
    rt.block_on(async {
        // your async code here
        Ok(())
    })
}
```
If writing the output file fails, make sure the target directory is writable (or point `output_file` at a directory like `/tmp`):

```rust
use ai_context_gen::{Config, generate_context_with_config};
use std::path::PathBuf;

// Make sure the output directory is writable
let config = Config {
    repo_path: PathBuf::from("./my-project"),
    output_file: "/tmp/context.md".to_string(), // Use a temp dir if needed
    // ... other config
};
```
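If you want to fail fast before running the generator, a plain `std::fs` probe works; this is just a generic check with a hypothetical helper name, nothing crate-specific:

```rust
use std::fs;
use std::path::Path;

// Hypothetical helper: create and remove a probe file in the target
// directory to confirm we can write there before generating the context.
fn dir_is_writable(dir: &Path) -> bool {
    let probe = dir.join(".ai_context_gen_write_probe");
    match fs::write(&probe, b"probe") {
        Ok(()) => {
            let _ = fs::remove_file(&probe);
            true
        }
        Err(_) => false,
    }
}

fn main() {
    let dir = Path::new("/tmp");
    println!("{} writable: {}", dir.display(), dir_is_writable(dir));
}
```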
If output from a large project is getting truncated, raise the token limit or narrow the scope:

```bash
# Use higher token limits for large projects
ai-context-gen --path ./large-project --max-tokens 200000

# Or focus on specific parts
ai-context-gen --path ./large-project/src --max-tokens 50000

# Include hidden files
ai-context-gen --include-hidden
```

For library usage, modify `Config` instead:

```rust
let config = Config {
    include_hidden: true,
    include_deps: true,
    // ...
};
```