| Crates.io | rswarm |
| lib.rs | rswarm |
| version | 0.1.7 |
| created_at | 2024-11-21 18:02:16.231456+00 |
| updated_at | 2025-02-24 01:00:41.193191+00 |
| description | A Rust implementation of the Swarm framework |
| homepage | |
| repository | |
| max_upload_size | |
| id | 1456512 |
| size | 356,652 |
Welcome, fellow Rustacean! If you’re aiming to integrate advanced AI agent interactions into your Rust applications, you’ve come to the right place. rswarm is a powerful and user-friendly library designed to simplify and enhance your AI development experience in Rust.
Embark on this journey with us as we explore how rswarm can empower your projects with intelligent agent capabilities.
rswarm is a Rust library crafted to streamline AI agent interactions, particularly when working with OpenAI’s API. It provides a robust framework for:
- defining agents with static or dynamic instructions
- running conversations, from single exchanges to multi-step execution flows
- streaming responses in real time
- registering custom asynchronous agent functions
Whether you’re building a chatbot, an AI assistant, or any application requiring intelligent dialogue, rswarm equips you with the tools to make it happen efficiently.
This project, rswarm, is inspired by and extends the concepts introduced in the Swarm framework developed by OpenAI. Swarm is an educational framework that explores ergonomic, lightweight multi-agent orchestration. It provides a foundation for agent coordination and execution through abstractions like Agents and handoffs, allowing for scalable and customizable solutions.
We would like to express our gratitude to the OpenAI team for their innovative work on Swarm, which has significantly influenced the development of rswarm. Special thanks to the core contributors of Swarm, including Ilan Bigio, James Hills, Shyamal Anadkat, Charu Jaiswal, Colin Jarvis, and Katia Gil Guzman, among others.
By building upon Swarm, rswarm aims to bring these powerful concepts into the Rust ecosystem, enhancing them to suit our specific needs and preferences. We hope to continue pushing the boundaries of what's possible with Rust and AI, inspired by the groundwork laid by OpenAI.
Feel free to explore the rswarm framework further, contribute to its development, or reach out with questions. Together, we can continue to innovate and expand the capabilities of AI agent interactions.
Happy coding!
To get started with rswarm, you need to add it to your project’s dependencies. Ensure you have Rust and Cargo installed on your system.
Add it to your project by running:
cargo add rswarm
After updating Cargo.toml, fetch the dependencies by running:
cargo build
rswarm relies on environment variables for configuration:
- OPENAI_API_KEY (required): your OpenAI API key.
- OPENAI_API_URL (optional): overrides the default chat completions endpoint.
Set them in your shell or a .env file:
export OPENAI_API_KEY="your-api-key"
export OPENAI_API_URL="https://api.openai.com/v1/chat/completions" # Optional
In your Rust application, load the .env file:
dotenv::dotenv().ok();
Note: Keep your API key secure and avoid committing it to version control.
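For example, here is a minimal sketch of loading configuration at startup, assuming the dotenv crate is in your dependencies (the println! is only there to confirm the key was found):

use std::env;

fn main() {
    // Load variables from a local .env file, if one exists.
    dotenv::dotenv().ok();
    // Fail fast if the key is missing instead of erroring on the first API call.
    let api_key = env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");
    println!("Loaded an API key with {} characters", api_key.len());
}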
Let’s dive into some examples to see rswarm in action.
The Swarm struct is the heart of the library, managing API communication and agent interactions. You can create a Swarm instance using the builder pattern.
use rswarm::Swarm;
let swarm = Swarm::builder()
.build()
.expect("Failed to create Swarm");
The example above works when the OPENAI_API_KEY environment variable is set, so .with_api_key() can be omitted. If you prefer to pass the API key explicitly:
let swarm = Swarm::builder()
.with_api_key("your-api-key".to_string())
.build()
.expect("Failed to create Swarm");
An Agent encapsulates the behavior and capabilities of an AI assistant.
use rswarm::{Agent, Instructions};
let agent = Agent {
name: "assistant".to_string(),
model: "gpt-3.5-turbo".to_string(),
instructions: Instructions::Text("You are a helpful assistant.".to_string()),
functions: vec![],
function_call: None,
parallel_tool_calls: false,
};
Instructions guide the agent’s behavior. They can be:
- Text: a static String.
- Function: a closure that builds the instruction string from context variables at runtime.

Example of dynamic instructions:
use rswarm::{Instructions, ContextVariables};
use std::sync::Arc;
let dynamic_instructions = Instructions::Function(Arc::new(|context: ContextVariables| {
format!(
"You are a helpful assistant aware of the user's location: {}.",
context.get("location").unwrap_or(&"unknown".to_string())
)
}));
Let’s initiate a conversation with our agent.
use rswarm::{Message, ContextVariables};
use std::collections::HashMap;
let messages = vec![Message {
role: "user".to_string(),
content: Some("Hello, assistant!".to_string()),
name: None,
function_call: None,
}];
let context_variables = ContextVariables::new(); // An empty context
let response = swarm
.run(
agent.clone(),
messages,
context_variables,
None, // No model override
false, // Streaming disabled
false, // Debug mode off
5 // Max turns
)
.await
.expect("Failed to run the conversation");
for msg in response.messages {
println!("{}: {}", msg.role, msg.content.unwrap_or_default());
}
The agent responds according to the instructions provided.
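Putting the pieces together: run() is async, so the calls above need an async runtime. Below is a minimal end-to-end sketch that combines the snippets from this section; it assumes tokio is added as a dependency with the macros and rt-multi-thread features.

use rswarm::{Agent, ContextVariables, Instructions, Message, Swarm};

#[tokio::main]
async fn main() {
    // Load OPENAI_API_KEY from the environment or a .env file.
    dotenv::dotenv().ok();
    let swarm = Swarm::builder().build().expect("Failed to create Swarm");

    let agent = Agent {
        name: "assistant".to_string(),
        model: "gpt-3.5-turbo".to_string(),
        instructions: Instructions::Text("You are a helpful assistant.".to_string()),
        functions: vec![],
        function_call: None,
        parallel_tool_calls: false,
    };

    let messages = vec![Message {
        role: "user".to_string(),
        content: Some("Hello, assistant!".to_string()),
        name: None,
        function_call: None,
    }];

    let response = swarm
        .run(agent, messages, ContextVariables::new(), None, false, false, 5)
        .await
        .expect("Failed to run the conversation");

    for msg in response.messages {
        println!("{}: {}", msg.role, msg.content.unwrap_or_default());
    }
}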
For real-time applications, you can enable streaming to receive incremental responses.
Instead of calling run(), create a Streamer from the Swarm’s client and API key:
use rswarm::stream::Streamer;
use futures_util::StreamExt;
use std::collections::HashMap;
let streamer = Streamer::new(swarm.client.clone(), swarm.api_key.clone());
let history: Vec<rswarm::Message> = Vec::new();
let context_variables = HashMap::new();
let agent = agent.clone(); // The primary agent
println!("Starting streaming conversation output:");
let mut stream = streamer.stream_chat(&agent, &history, &context_variables, None, false);
// Process each streamed message as soon as it arrives.
while let Some(item) = stream.next().await {
match item {
Ok(message) => {
println!(
"{} {}: {}",
message.name.as_deref().unwrap_or("Unknown"),
message.role,
message.content.as_deref().unwrap_or("")
);
println!("--------------------------------");
}
Err(e) => eprintln!("Stream error: {}", e),
}
}
println!("Streaming conversation completed.");
In this example, the agent’s response is received incrementally via the Streamer.
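If you also want the complete reply as a single string once streaming finishes, you can accumulate the chunks as they arrive; a small variation on the loop above:

let mut full_reply = String::new();
while let Some(item) = stream.next().await {
    match item {
        Ok(message) => {
            let chunk = message.content.as_deref().unwrap_or("");
            // Print each chunk as it arrives and keep a running copy.
            print!("{}", chunk);
            full_reply.push_str(chunk);
        }
        Err(e) => eprintln!("Stream error: {}", e),
    }
}
println!();
println!("Full reply was {} characters long.", full_reply.len());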
Let’s explore rswarm in greater detail, uncovering its full potential.
Customize Swarm behavior using SwarmConfig.
use rswarm::{Swarm, SwarmConfig};
let custom_config = SwarmConfig {
request_timeout: 60,
max_retries: 5,
..Default::default()
};
let swarm = Swarm::builder()
.with_config(custom_config)
.build()
.expect("Failed to create Swarm with custom configuration");
Adjust parameters like timeouts and retries based on application needs.
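The builder methods compose, so an explicit setup might combine the two builder options shown so far; a brief sketch:

let swarm = Swarm::builder()
    .with_api_key("your-api-key".to_string())
    .with_config(custom_config)
    .build()
    .expect("Failed to create Swarm");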
Note: In this version of rswarm, AgentFunction has been refactored to be asynchronous: the stored function must return a pinned, boxed future (built with Box::pin(async move { ... })) rather than a plain synchronous result. This removes the need for blocking calls and makes integration with asynchronous runtimes seamless.
Below is an updated example:
use rswarm::{AgentFunction, ContextVariables, ResultType};
use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;
let echo_function = AgentFunction {
name: "echo".to_string(),
function: Arc::new(|args: ContextVariables| -> Pin<Box<dyn Future<Output = Result<ResultType, anyhow::Error>> + Send>> {
Box::pin(async move {
let message = args.get("message").cloned().unwrap_or_default();
Ok(ResultType::Value(message))
})
}),
accepts_context_variables: true,
};
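// Note: `agent` must be bound mutably (e.g. `let mut agent = ...`) for the assignments below to compile.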
agent.functions.push(echo_function);
agent.function_call = Some("auto".to_string());
With function_call set to "auto", the agent decides when to use the functions.
Context variables provide dynamic data to agents.
let mut context_variables = ContextVariables::new();
context_variables.insert("location".to_string(), "Berlin".to_string());
let dynamic_instructions = Instructions::Function(Arc::new(|context: ContextVariables| {
format!("You are a helpful assistant. The user's location is {}.", context.get("location").unwrap())
}));
agent.instructions = dynamic_instructions;
The agent tailors responses based on the context provided.
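Pass the populated context into run() so the dynamic instructions can read it; a short sketch reusing the call shape shown earlier:

let messages = vec![Message {
    role: "user".to_string(),
    content: Some("Which city am I in?".to_string()),
    name: None,
    function_call: None,
}];

let response = swarm
    .run(
        agent.clone(),
        messages,
        context_variables, // carries "location" => "Berlin"
        None,
        false,
        false,
        5
    )
    .await
    .expect("Failed to run the conversation");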
Agents can call functions during conversations to perform specific tasks.
Define a function, add it to the agent, and then proceed with a conversation:
use rswarm::{AgentFunction, ContextVariables, ResultType};
use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;
let echo_function = AgentFunction {
name: "echo".to_string(),
function: Arc::new(|args: ContextVariables| -> Pin<Box<dyn Future<Output = Result<ResultType, anyhow::Error>> + Send>> {
Box::pin(async move {
let message = args.get("message").cloned().unwrap_or_default();
Ok(ResultType::Value(message))
})
}),
accepts_context_variables: true,
};
agent.functions.push(echo_function);
agent.function_call = Some("auto".to_string());
let messages = vec![Message {
role: "user".to_string(),
content: Some("Repeat after me: Hello World!".to_string()),
name: None,
function_call: None,
}];
let response = swarm
.run(
agent.clone(),
messages,
ContextVariables::new(),
None,
false,
false,
5
)
.await
.expect("Failed to run the conversation");
for msg in response.messages {
println!("{}: {}", msg.role, msg.content.unwrap_or_default());
}
rswarm also allows for XML definitions to structure multi-step interactions.
<steps>
<step number="1" action="run_once">
<prompt>Introduce yourself.</prompt>
</step>
<step number="2" action="loop" agent="assistant">
<prompt>Answer the user's questions until they say 'goodbye'.</prompt>
</step>
</steps>
use rswarm::{extract_xml_steps, parse_steps_from_xml, Steps};
let instructions = r#"
You are about to engage in a conversation.
<steps>
<step number="1" action="run_once">
<prompt>Introduce yourself.</prompt>
</step>
<step number="2" action="loop" agent="assistant">
<prompt>Answer the user's questions until they say 'goodbye'.</prompt>
</step>
</steps>
Proceed with the conversation.
"#;
let (instructions_without_xml, xml_steps) = extract_xml_steps(instructions).unwrap();
let steps = if let Some(xml_content) = xml_steps {
parse_steps_from_xml(&xml_content).unwrap()
} else {
Steps { steps: Vec::new() }
};
The Swarm’s run() method automatically handles the execution of steps defined in XML.
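Before the conversation starts, you can sanity-check what was extracted; a quick sketch using only the values produced above:

// The surrounding prose with the <steps> block removed.
println!("Instructions without XML:\n{}", instructions_without_xml);
// How many steps were parsed from the XML definition.
println!("Parsed {} step(s)", steps.steps.len());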
We’ve explored the landscape of rswarm—how it simplifies AI agent interactions in Rust while providing advanced features like streaming responses, agent functions, and XML-based execution flows. Whether you’re just starting with AI in Rust or pushing the boundaries of complex interactions, rswarm provides a robust foundation to build upon.
Happy coding!
pub struct Swarm {
pub client: Client,
pub api_key: String,
pub agent_registry: HashMap<String, Agent>,
pub config: SwarmConfig,
}
Purpose: Manages API communication and agent interactions. Key Methods:
- run(): Executes a conversation (batch mode or with XML-defined steps).
- builder(): Initializes a SwarmBuilder.
- get_agent_by_name(): Retrieves an agent from the registry.

pub struct Agent {
pub name: String,
pub model: String,
pub instructions: Instructions,
pub functions: Vec<AgentFunction>,
pub function_call: Option<String>,
pub parallel_tool_calls: bool,
}
Purpose: Defines an AI assistant’s behavior. Fields:
- name: Unique identifier.
- model: AI model to be used (e.g., "gpt-3.5-turbo", "gpt-4").
- instructions: Guides the agent’s responses (static or dynamic).
- functions: Custom functions available to the agent.
- function_call: Determines when functions are called.
- parallel_tool_calls: Enables parallel execution of functions.

pub struct AgentFunction {
pub name: String,
pub function: Arc<dyn Fn(ContextVariables) -> Pin<Box<dyn Future<Output = Result<ResultType, anyhow::Error>> + Send>> + Send + Sync>,
pub accepts_context_variables: bool,
}
Purpose: Enables agents to execute custom logic asynchronously. Fields:
- name: Identifier for the function.
- function: The asynchronous function logic.
- accepts_context_variables: Indicates if context variables are used.

pub struct SwarmConfig {
pub api_url: String,
pub api_version: String,
pub request_timeout: u64,
pub connect_timeout: u64,
pub max_retries: u32,
pub max_loop_iterations: u32,
pub valid_model_prefixes: Vec<String>,
pub valid_api_url_prefixes: Vec<String>,
pub loop_control: LoopControl,
pub api_settings: ApiSettings,
}
Purpose: Configures Swarm behavior, including API endpoints, timeouts, and retry logic.
pub enum Instructions {
Text(String),
Function(Arc<dyn Fn(ContextVariables) -> String + Send + Sync>),
}
Purpose: Provides static or dynamic instructions for agents.
pub type ContextVariables = HashMap<String, String>;
Purpose: Stores key-value pairs for dynamic context within conversations.
pub enum ResultType {
Value(String),
Agent(Agent),
ContextVariables(ContextVariables),
}
Purpose: Represents the result of an agent function execution.
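For instance, an agent function can hand updated context back to the conversation instead of plain text. The sketch below reuses the AgentFunction shape shown earlier; how the Swarm applies the returned variant downstream is handled by the library, so treat this only as an illustration of constructing the value:

use rswarm::{AgentFunction, ContextVariables, ResultType};
use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;

let set_location = AgentFunction {
    name: "set_location".to_string(),
    function: Arc::new(|args: ContextVariables| -> Pin<Box<dyn Future<Output = Result<ResultType, anyhow::Error>> + Send>> {
        Box::pin(async move {
            let mut updated = ContextVariables::new();
            // Copy the requested location into the context, defaulting if it is absent.
            updated.insert(
                "location".to_string(),
                args.get("location").cloned().unwrap_or_else(|| "unknown".to_string()),
            );
            Ok(ResultType::ContextVariables(updated))
        })
    }),
    accepts_context_variables: true,
};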
The new Streamer struct enables receiving real-time agent responses.
pub struct Streamer {
client: Client,
api_key: String,
}
Key Method:
- stream_chat(): Returns an asynchronous stream yielding incremental responses as Message items.

This project is licensed under the MIT License.
A heartfelt thank you to all contributors and the Rust community. Your support and collaboration make projects like rswarm possible.
Feel free to explore the library further, contribute to its development, or reach out with questions. Together, we can continue to push the boundaries of what’s possible with Rust and AI.
Happy coding!
────────────────────────────
Note:
In this version, the `AgentFunction` struct now requires that the function field return an asynchronous pinned boxed future. This means that when defining agent functions, wrap your function body using `Box::pin(async move { ... })` to produce the correct return type.
────────────────────────────