| Field | Value |
|---|---|
| Crates.io | sayr-engine |
| lib.rs | sayr-engine |
| version | 0.3.0 |
| created_at | 2025-12-24 15:15:37.935866+00 |
| updated_at | 2025-12-24 15:15:37.935866+00 |
| description | A high-performance Rust AI agent runtime inspired by the Agno framework |
| homepage | https://github.com/YASSERRMD/agno-rust |
| repository | https://github.com/YASSERRMD/agno-rust |
| max_upload_size | |
| id | 2003394 |
| size | 520,004 |
A high-performance Rust AI agent runtime inspired by the Agno framework.
SAYR Engine is an independent, community-driven Rust project inspired by the Agno framework. It is not affiliated with, endorsed by, or maintained by the Agno company or its authors.
Supported LLM providers, each selected via its environment variable:

| Provider | Default Model | Environment Variable |
|---|---|---|
| OpenAI | gpt-4 | OPENAI_API_KEY |
| Anthropic | claude-3-sonnet | ANTHROPIC_API_KEY |
| Google Gemini | gemini-pro | GOOGLE_API_KEY |
| Cohere | command-r-plus | COHERE_API_KEY |
| Groq | llama-3.3-70b-versatile | GROQ_API_KEY |
| Ollama | llama3.1 | OLLAMA_HOST (optional) |
| Mistral | mistral-large-latest | MISTRAL_API_KEY |
| Azure OpenAI | gpt-4 | AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY |
| AWS Bedrock | claude-3-sonnet | AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION |
| Together AI | Llama-3.3-70B-Instruct | TOGETHER_API_KEY |
| Fireworks | llama-v3p1-70b-instruct | FIREWORKS_API_KEY |
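Each hosted provider in the table above is configured by exporting its API key before starting the agent. A minimal shell sketch (the key value is a placeholder; Ollama runs locally and needs no key):

```sh
# Export the key for the provider you want to use; the variable name
# comes from the table above and the value here is a placeholder.
export GROQ_API_KEY="your-groq-key"

# OLLAMA_HOST is optional and only needed for a non-default Ollama endpoint.
echo "Groq configured: ${GROQ_API_KEY:+yes}"
```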
Built-in toolkits by category:

| Category | Toolkits | Description |
|---|---|---|
| Search | DuckDuckGo, Wikipedia, Arxiv, PubMed | Web, knowledge, and academic search |
| Communication | Slack, Gmail, Discord | Messaging and email integration |
| Development | GitHub, Shell, HTTP | Code repos, commands, API calls |
| Data | SQL (SQLite), Postgres, DuckDB, JSON, Calculator | Database queries, data processing |
Add to your `Cargo.toml`:

```toml
[dependencies]
sayr-engine = "0.3"
```
```rust
use sayr_engine::{Agent, OpenAIClient, ToolRegistry};
use sayr_engine::tools::{calculator_toolkit, duckduckgo_toolkit};

#[tokio::main]
async fn main() -> sayr_engine::Result<()> {
    // Create the LLM client
    let model = OpenAIClient::from_env()?.with_model("gpt-4o");

    // Register tools
    let mut tools = ToolRegistry::new();
    calculator_toolkit(&mut tools);
    duckduckgo_toolkit(&mut tools, Default::default());

    // Create the agent
    let mut agent = Agent::new(model).with_tools(tools);

    // Chat
    let reply = agent.respond("What is 42 * 17?").await?;
    println!("{reply}");
    Ok(())
}
```
```rust
use sayr_engine::tools::{register_github_tools, GitHubClient};

let mut tools = ToolRegistry::new();
register_github_tools(&mut tools); // Uses the GITHUB_TOKEN env var

// Tools: github_search_repos, github_get_repo, github_list_issues, github_read_file
```
```rust
use sayr_engine::tools::{register_slack_tools, SlackClient};

let mut tools = ToolRegistry::new();
register_slack_tools(&mut tools, std::env::var("SLACK_BOT_TOKEN")?);

// Tools: slack_send_message, slack_list_channels, slack_search
```
```rust
use sayr_engine::tools::register_sql_tools;

let mut tools = ToolRegistry::new();
register_sql_tools(&mut tools, "/path/to/database.db");

// Tools: sql_query (read-only by default), sql_schema
```
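Conceptually, "read-only by default" is a gate on the statement type before anything reaches the database. A self-contained illustration of that idea in plain Rust (simplified; not the crate's actual implementation, which would also need to handle multi-statement inputs and writing CTEs):

```rust
// Illustrative read-only gate: allow only statements that begin with
// SELECT or WITH after trimming and case-folding. A production check
// must be stricter (reject multiple statements, PRAGMAs, writing CTEs).
fn is_read_only(sql: &str) -> bool {
    let head = sql.trim_start().to_ascii_lowercase();
    head.starts_with("select") || head.starts_with("with")
}

fn main() {
    assert!(is_read_only("SELECT * FROM users"));
    assert!(!is_read_only("DROP TABLE users"));
    println!("read-only gate ok");
}
```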
```rust
use sayr_engine::tools::{register_arxiv_tools, register_pubmed_tools};

let mut tools = ToolRegistry::new();
register_arxiv_tools(&mut tools);  // Search arXiv papers
register_pubmed_tools(&mut tools); // Search PubMed

// Tools: arxiv_search, pubmed_search
```
```rust
use sayr_engine::{WindowedMemoryStrategy, SummarizedMemoryStrategy};

// Keep the last 10 messages
let strategy = WindowedMemoryStrategy::new(10);

// Or use summarization for long conversations
let strategy = SummarizedMemoryStrategy::new(5, 5);
```
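The windowed strategy can be pictured as a fixed-size queue that evicts the oldest message when full. A self-contained sketch of that idea in plain Rust (illustrative only; not sayr-engine's actual types):

```rust
use std::collections::VecDeque;

// Sketch of a windowed memory strategy: keep only the `window`
// most recent messages, evicting from the front (oldest first).
struct WindowedMemory {
    window: usize,
    messages: VecDeque<String>,
}

impl WindowedMemory {
    fn new(window: usize) -> Self {
        Self { window, messages: VecDeque::new() }
    }

    fn push(&mut self, msg: impl Into<String>) {
        self.messages.push_back(msg.into());
        while self.messages.len() > self.window {
            self.messages.pop_front(); // evict the oldest message
        }
    }
}

fn main() {
    let mut mem = WindowedMemory::new(3);
    for i in 1..=5 {
        mem.push(format!("msg {i}"));
    }
    // Only the last 3 messages survive; the oldest remaining is "msg 3".
    assert_eq!(mem.messages.len(), 3);
    assert_eq!(mem.messages.front().map(String::as_str), Some("msg 3"));
    println!("windowed memory ok");
}
```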
```rust
use sayr_engine::guardrails::{
    GuardrailChain, GuardrailResult, PiiGuardrail, PromptInjectionGuardrail,
};

let mut chain = GuardrailChain::new();
chain.add(PiiGuardrail::new());
chain.add(PromptInjectionGuardrail::new());

// Check input before sending it to the LLM
match chain.validate("My SSN is 123-45-6789") {
    GuardrailResult::Block(reason) => println!("Blocked: {reason}"),
    GuardrailResult::Pass => println!("Safe to proceed"),
}
```
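Under the hood, a guardrail chain simply runs each check in order and stops at the first one that blocks. A self-contained illustration of the pattern in plain Rust (simplified; not sayr-engine's real types or its actual PII detector):

```rust
// Sketch of the guardrail-chain pattern: each guardrail returns
// Some(reason) to block, None to pass; the chain short-circuits.
trait Guardrail {
    fn check(&self, input: &str) -> Option<String>;
}

struct SsnGuardrail;

impl Guardrail for SsnGuardrail {
    fn check(&self, input: &str) -> Option<String> {
        // Naive SSN shape check: ddd-dd-dddd anywhere in the input.
        let bytes = input.as_bytes();
        let is_ssn_at = |i: usize| {
            i + 11 <= bytes.len()
                && bytes[i..i + 3].iter().all(u8::is_ascii_digit)
                && bytes[i + 3] == b'-'
                && bytes[i + 4..i + 6].iter().all(u8::is_ascii_digit)
                && bytes[i + 6] == b'-'
                && bytes[i + 7..i + 11].iter().all(u8::is_ascii_digit)
        };
        (0..bytes.len())
            .any(is_ssn_at)
            .then(|| "possible SSN detected".to_string())
    }
}

struct Chain(Vec<Box<dyn Guardrail>>);

impl Chain {
    fn validate(&self, input: &str) -> Result<(), String> {
        for guardrail in &self.0 {
            if let Some(reason) = guardrail.check(input) {
                return Err(reason); // first block wins
            }
        }
        Ok(())
    }
}

fn main() {
    let chain = Chain(vec![Box::new(SsnGuardrail)]);
    assert!(chain.validate("My SSN is 123-45-6789").is_err());
    assert!(chain.validate("hello world").is_ok());
    println!("guardrail chain ok");
}
```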
```rust
use sayr_engine::mcp::{McpClient, McpTools, StdioTransport};

// Connect to an MCP server
let transport = StdioTransport::spawn(
    "npx",
    &["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
)?;
let client = McpClient::new(transport);
client.initialize().await?;

// Register the MCP server's tools with the agent's registry
let mcp_tools = McpTools::new(client);
mcp_tools.register_all(&mut tools).await?;
```
```rust
use sayr_engine::{init_tracing, init_prometheus_registry};

// Initialize OpenTelemetry tracing
init_tracing("my-agent", Some("http://otel-collector:4317"));

// Initialize the Prometheus metrics registry
init_prometheus_registry();
```
```text
sayr-engine/
├── src/
│   ├── agent.rs        # Core agent loop
│   ├── llm.rs          # 10 LLM provider clients
│   ├── tools/          # 13 built-in toolkits
│   ├── guardrails.rs   # PII and injection detection
│   ├── memory.rs       # Memory strategies
│   ├── mcp.rs          # MCP client
│   ├── knowledge/      # RAG and vector stores
│   ├── reasoning.rs    # Chain-of-thought
│   └── server.rs       # HTTP runtime
├── libs/
│   └── sayr-py/        # PyO3 Python bindings
├── cookbook/           # Example agents
└── scripts/            # Development utilities
```
You can use the Rust implementation from Python with familiar Agno syntax:
```python
from sayr import Agent, OpenAIChat

agent = Agent(
    model=OpenAIChat(id="gpt-4"),
    description="You are a helpful assistant",
)
agent.print_response("Hello from Rust!")
```
```sh
# Run all tests
cargo test

# Run with coverage
cargo tarpaulin

# Format and lint
./scripts/check.sh
```
MIT License - see LICENSE for details.
Inspired by agno-agi/agno - the Python agent framework.