| Crates.io | structured-json-agent |
| lib.rs | structured-json-agent |
| version | 0.1.0 |
| created_at | 2026-01-22 13:32:42.886818+00 |
| updated_at | 2026-01-22 13:32:42.886818+00 |
| description | A typed and extensible library for creating and running Iterative AI Agents that generate structured JSON output. |
| homepage | |
| repository | https://github.com/thiagoaramizo/structured-json-agent-rust |
| max_upload_size | |
| id | 2061587 |
| size | 76,044 |
A typed and extensible Rust library for creating and running Iterative AI Agents that guarantee structured JSON output.
This library orchestrates a Generator ↔ Reviewer cycle to ensure that the output from Large Language Models (LLMs) strictly adheres to a defined JSON Schema.
Ships with built-in support for OpenAI (via `async-openai`), but is extensible to other providers.

Add this to your `Cargo.toml`:
```toml
[dependencies]
structured-json-agent = "0.1.0"
serde_json = "1.0"
tokio = { version = "1.0", features = ["full"] }
```
```rust
use structured_json_agent::{StructuredAgent, StructuredAgentConfig, OpenAIService};
use serde_json::json;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // 1. Define your schemas
    let input_schema = json!({
        "type": "object",
        "properties": {
            "topic": { "type": "string" },
            "depth": { "type": "string", "enum": ["basic", "advanced"] }
        },
        "required": ["topic", "depth"]
    });

    let output_schema = json!({
        "type": "object",
        "properties": {
            "title": { "type": "string" },
            "keyPoints": { "type": "array", "items": { "type": "string" } },
            "summary": { "type": "string" }
        },
        "required": ["title", "keyPoints", "summary"]
    });

    // 2. Initialize the agent
    let api_key = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY not set");
    let llm_service = Arc::new(OpenAIService::new(api_key));

    let agent = StructuredAgent::new(StructuredAgentConfig {
        llm_service,
        generator_model: "gpt-4-turbo".to_string(),
        reviewer_model: "gpt-3.5-turbo".to_string(), // Can be a faster/cheaper model for simple fixes
        input_schema,
        output_schema,
        system_prompt: "You are an expert summarizer. Create a structured summary based on the topic.".to_string(),
        max_iterations: Some(3), // Optional: max correction attempts (default: 5)
    })?;

    // 3. Run the agent
    let result = agent.run(json!({
        "topic": "Clean Architecture",
        "depth": "advanced"
    })).await?;

    println!("Result: {}", serde_json::to_string_pretty(&result)?);
    // The output is guaranteed to match `output_schema`
    Ok(())
}
```
How it works:

1. The input JSON is validated against `input_schema`.
2. The `generator_model` creates an initial response based on the system prompt and input.
3. The response is validated against `output_schema`.
4. If validation fails, the `reviewer_model` is invoked with the invalid JSON, the specific validation errors, and the expected schema. It attempts to fix the JSON.
5. Steps 3 and 4 repeat until the output is valid or `max_iterations` is reached.

You can implement the `LLMService` trait to support other LLM providers (Anthropic, Gemini, local models, etc.).
```rust
use async_trait::async_trait;
use structured_json_agent::{LLMService, AgentError};

pub struct MyCustomLLM;

#[async_trait]
impl LLMService for MyCustomLLM {
    async fn chat_completion(&self, model: &str, system_prompt: &str, user_prompt: &str) -> Result<String, AgentError> {
        // Implement your custom logic here
        Ok("{}".to_string())
    }
}
```
License: MIT