| Field | Value |
| --- | --- |
| Crates.io | spec-ai |
| lib.rs | spec-ai |
| version | 0.6.0-prerelease.12 |
| created_at | 2025-11-12 18:06:47.35946+00 |
| updated_at | 2026-01-04 05:08:25.718541+00 |
| description | A framework for building AI agents with structured outputs, policy enforcement, and execution tracking |
| homepage | |
| repository | https://github.com/geoffsee/spec-ai |
| max_upload_size | |
| id | 1929713 |
| size | 225,439 |
Public library crate for the spec-ai framework.
This is the main library crate: it re-exports all public APIs from the spec-ai workspace crates and provides a unified interface for building AI agent applications.
Feature flags:

- `openai` - OpenAI API integration
- `lmstudio` - LM Studio local models
- `web-scraping` - Web scraping capabilities
- `vttrs` - Video/subtitle processing
- `api` - HTTP API server
- `cli` - Command-line interface

LLM Providers:

- `anthropic` - Anthropic Claude API
- `ollama` - Ollama local models
- `mlx` - Apple MLX framework

Database:

- `bundled` - Bundled DuckDB library (recommended)
- `duck-sys` - System DuckDB library

Other:

- `integration-tests` - Enable integration tests
- `axum-extra` - Additional Axum web framework features

Add to your Cargo.toml:
```toml
[dependencies]
spec-ai = "0.5"
```
Or install the CLI:

```sh
cargo install spec-ai --features bundled
```
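The same feature flags apply when installing the binary. For example, a sketch that also enables the Anthropic provider (whether you need it alongside `bundled` depends on your setup):

```sh
# Sketch: install the CLI with the bundled DuckDB library plus the
# Anthropic provider (feature names from the list above).
cargo install spec-ai --features "bundled,anthropic"
```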
```rust
use spec_ai::prelude::*;

// Your agent application code here
```
This crate also provides the `spec-ai` binary:

```sh
# Start interactive session
spec-ai

# Run a spec file
spec-ai run task.spec

# Use custom config
spec-ai --config custom.toml
```
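These invocations can be combined; a sketch, assuming `run` accepts the same `--config` flag (the README does not confirm this):

```sh
# Sketch: run a spec file with a custom configuration file
# (flag placement relative to the subcommand is an assumption).
spec-ai --config custom.toml run task.spec
```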
This crate re-exports functionality from the other crates in the spec-ai workspace.
For detailed documentation, see:
- `docs/ARCHITECTURE.md` - System architecture
- `docs/CONFIGURATION.md` - Configuration guide
- `docs/SETUP.md` - Setup instructions

Example configurations and code can be found in the repository:

- `examples/configs/` - Configuration examples
- `examples/code/` - Code examples
- `specs/` - Example spec files

MIT License - see LICENSE or the main README for details.
Create an issue or open a PR at the spec-ai repository.