sruim-crewai

version: 0.1.0
created_at: 2026-01-19 15:16:49.115837+00
updated_at: 2026-01-19 15:16:49.115837+00
description: A high-performance multi-agent orchestration engine written in Rust
homepage: https://github.com/Sruimeng/CrewAI-RS
repository: https://github.com/Sruimeng/CrewAI-RS
size: 176,447
owner: Sruim (Sruimeng)
documentation: https://docs.rs/sruim-crewai

README

Sruim-CrewAI

Chinese documentation

A high-performance multi-agent orchestration engine written in Rust.

Features

  • Actor Model: Agents run as isolated Tokio tasks, communicating via channels
  • Type Safety: Compile-time guarantees, no runtime "AttributeError"
  • Zero-Cost Abstractions: Compiles to a single binary, no interpreter needed
  • LLM Integration: Built-in Anthropic Claude support, extensible provider trait
  • Tool System: JSON Schema auto-generation via schemars (see the sketch after this list)
  • DAG Scheduling: Task dependencies with topological execution
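
The tool schemas come from schemars. Below is a minimal, standalone sketch of that schema auto-generation using schemars and serde_json directly; the SearchParams struct is made up for illustration, and how ToolRegistry consumes the generated schema is not shown:

use schemars::{schema_for, JsonSchema};

/// Hypothetical parameters for a "search" tool. The derive produces the
/// JSON Schema, including these doc comments as field descriptions.
#[derive(JsonSchema)]
#[allow(dead_code)]
struct SearchParams {
    /// The query to run.
    query: String,
    /// Maximum number of results to return.
    limit: Option<u32>,
}

fn main() {
    // schema_for! yields a schema value that serializes straight to JSON.
    let schema = schema_for!(SearchParams);
    println!("{}", serde_json::to_string_pretty(&schema).unwrap());
}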

Quick Start

Add to Cargo.toml:

[dependencies]
sruim-crewai = "0.1"
tokio = { version = "1.43", features = ["full"] }
dotenvy = "0.15"  # Optional: for .env loading

Then spawn an agent, assign it a task, and wait for the result:

use sruim_crewai::agent::LlmAgent;
use sruim_crewai::llm::AnthropicProvider;
use sruim_crewai::tool::ToolRegistry;
use sruim_crewai::types::{AgentConfig, Message, TaskId};
use std::sync::Arc;
use tokio::sync::mpsc;

#[tokio::main]
async fn main() {
    dotenvy::dotenv().ok();

    // Build the LLM provider from environment variables (see Environment Variables below).
    let provider = Arc::new(AnthropicProvider::from_env().unwrap());
    let (crew_tx, mut crew_rx) = mpsc::channel(32);

    let config = AgentConfig::new("Assistant", "Help users");
    // The agent runs as its own Tokio task and reports results back on crew_tx.
    let (agent_tx, _) = LlmAgent::spawn(config, provider, ToolRegistry::new(), crew_tx);

    agent_tx.send(Message::AssignTask {
        task_id: TaskId::new(),
        description: "What is 2 + 2?".into(),
        context: String::new(),
    }).await.unwrap();

    if let Some(Message::TaskResult { output, .. }) = crew_rx.recv().await {
        println!("{}", output);
    }

    agent_tx.send(Message::Terminate).await.unwrap();
}

Environment Variables

LLM_API_KEY=your-api-key
LLM_BASE_URL=https://api.anthropic.com/v1/messages
LLM_MODEL=claude-3-5-sonnet-20241022
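
If these live in a .env file, the dotenvy::dotenv() call from the Quick Start loads them into the process environment. A quick, crate-agnostic way to check they are visible (a sketch, nothing here is part of sruim-crewai):

fn main() {
    // Load .env from the working directory, if present.
    dotenvy::dotenv().ok();

    // Print the variables listed above, or note which ones are missing.
    for key in ["LLM_API_KEY", "LLM_BASE_URL", "LLM_MODEL"] {
        match std::env::var(key) {
            Ok(value) => println!("{key} = {value}"),
            Err(_) => println!("{key} is not set"),
        }
    }
}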

Architecture

┌──────────────────────────────────────────┐
│                   Crew                   │
│   ┌────────┐   ┌────────┐   ┌────────┐   │
│   │Agent A │   │Agent B │   │Agent C │   │
│   │(Tokio) │   │(Tokio) │   │(Tokio) │   │
│   └───┬────┘   └───┬────┘   └───┬────┘   │
│       └────────────┼────────────┘        │
│              mpsc::channel               │
└──────────────────────────────────────────┘
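
A sketch of how that diagram maps to code, reusing only the calls shown in the Quick Start; the roles given to Agent A/B/C are made up, and error handling is omitted:

use sruim_crewai::agent::LlmAgent;
use sruim_crewai::llm::AnthropicProvider;
use sruim_crewai::tool::ToolRegistry;
use sruim_crewai::types::AgentConfig;
use std::sync::Arc;
use tokio::sync::mpsc;

#[tokio::main]
async fn main() {
    dotenvy::dotenv().ok();
    let provider = Arc::new(AnthropicProvider::from_env().unwrap());

    // One crew-side channel; every agent gets a clone of the sender.
    let (crew_tx, _crew_rx) = mpsc::channel(32);

    // Each spawn starts an isolated Tokio task with its own mailbox.
    let (_a_tx, _) = LlmAgent::spawn(
        AgentConfig::new("Agent A", "Research the topic"),
        provider.clone(),
        ToolRegistry::new(),
        crew_tx.clone(),
    );
    let (_b_tx, _) = LlmAgent::spawn(
        AgentConfig::new("Agent B", "Draft a summary"),
        provider.clone(),
        ToolRegistry::new(),
        crew_tx.clone(),
    );
    let (_c_tx, _) = LlmAgent::spawn(
        AgentConfig::new("Agent C", "Review the draft"),
        provider,
        ToolRegistry::new(),
        crew_tx,
    );
    // Results from all three agents would arrive on _crew_rx.
}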

Testing

# Unit tests
cargo test

# E2E tests (requires API key)
cargo test --test e2e -- --ignored
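
For reference, a sketch of what an ignored end-to-end test in tests/e2e.rs could look like; the test name and assertion are hypothetical, and it simply replays the Quick Start flow, so it needs a real API key in the environment:

use sruim_crewai::agent::LlmAgent;
use sruim_crewai::llm::AnthropicProvider;
use sruim_crewai::tool::ToolRegistry;
use sruim_crewai::types::{AgentConfig, Message, TaskId};
use std::sync::Arc;
use tokio::sync::mpsc;

#[tokio::test]
#[ignore] // needs LLM_API_KEY; run with `cargo test --test e2e -- --ignored`
async fn agent_answers_a_simple_question() {
    dotenvy::dotenv().ok();

    let provider = Arc::new(AnthropicProvider::from_env().unwrap());
    let (crew_tx, mut crew_rx) = mpsc::channel(32);
    let (agent_tx, _) = LlmAgent::spawn(
        AgentConfig::new("Assistant", "Help users"),
        provider,
        ToolRegistry::new(),
        crew_tx,
    );

    agent_tx
        .send(Message::AssignTask {
            task_id: TaskId::new(),
            description: "What is 2 + 2?".into(),
            context: String::new(),
        })
        .await
        .unwrap();

    // The agent should come back with a TaskResult on the crew channel.
    let reply = crew_rx.recv().await.expect("agent sent no reply");
    assert!(matches!(reply, Message::TaskResult { .. }));

    agent_tx.send(Message::Terminate).await.unwrap();
}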

Documentation

See docs/getting-started.md for full documentation.

License

MIT
