| Crates.io | rust-agent |
| lib.rs | rust-agent |
| version | 0.0.5 |
| created_at | 2025-09-21 03:50:02.830206+00 |
| updated_at | 2025-11-03 17:12:45.966085+00 |
| description | Next Generation AI Agent Framework for Web3. |
| homepage | https://www.openpick.org |
| repository | https://github.com/aiqubits/rust-agent |
| max_upload_size | |
| id | 1848404 |
| size | 317,038 |
Rust Agent is a powerful and flexible AI Agent framework. It provides a comprehensive set of tools and components for building AI agents that can interact with various systems and perform complex tasks.
The framework consists of several key modules:
Defines the fundamental Runnable trait and related components, forming the foundation for all executable components in the framework.

Provides interfaces and implementations for various AI models:

- ChatModel: Chat-based model interface
- OpenAIChatModel: OpenAI-compatible API implementation

Implements core agent logic with Agent and AgentRunner interfaces:

- McpAgent: Main agent implementation with MCP service integration
- SimpleAgent: Basic agent implementation for simple use cases

Defines tool interfaces and implementation mechanisms:

- Tool: Core tool interface
- Toolkit: Interface for managing related tool groups
- McpToolAdapter: Adapter for integrating MCP tools with the framework's tool system

Provides components for interacting with MCP services:

- McpClient: Interface for MCP client implementations
- SimpleMcpClient: Basic MCP client implementation
- McpServer: Interface for MCP server implementations
- SimpleMcpServer: Basic MCP server implementation

Provides memory management components:

- BaseMemory: Base memory interface
- SimpleMemory: Simple memory implementation
- MessageHistoryMemory: Message history memory implementation
- SummaryMemory: Summary memory implementation
- CompositeMemory: Composite memory implementation combining multiple memory strategies

Add the following to your Cargo.toml:
```toml
[dependencies]
rust-agent = "0.0.5"
```
Here's a simple example of creating an AI agent using the framework:
```rust
use rust_agent::{McpAgent, SimpleMcpClient, McpTool, AgentOutput};
use std::sync::Arc;
use std::collections::HashMap;

#[tokio::main]
async fn main() {
    // Create MCP client
    let mut mcp_client = SimpleMcpClient::new("http://localhost:8080".to_string());

    // Add some MCP tools
    mcp_client.add_tools(vec![
        McpTool {
            name: "get_weather".to_string(),
            description: "Get weather information for a specified city".to_string(),
        }
    ]);

    // Wrap MCP client in Arc
    let mcp_client_arc = Arc::new(mcp_client);

    // Create McpAgent instance
    let mut agent = McpAgent::new(
        mcp_client_arc.clone(),
        "You are a helpful assistant".to_string()
    );

    // Automatically add tools from MCP client
    if let Err(e) = agent.auto_add_tools().await {
        println!("Failed to automatically add tools to McpAgent: {}", e);
    }

    // Build user input
    let mut input = HashMap::new();
    input.insert("input".to_string(), "What's the weather like in Beijing?".to_string());

    // Call agent to process input
    let result = agent.invoke(input).await;

    // Handle result
    match result {
        Ok(AgentOutput::Finish(finish)) => {
            if let Some(answer) = finish.return_values.get("answer") {
                println!("AI Response: {}", answer);
            }
        },
        Ok(AgentOutput::Action(action)) => {
            println!("Need to call tool: {}", action.tool);
            // Execute tool call...
            if let Some(thought) = &action.thought {
                println!("Thought process: {}", thought);
            }
        },
        Err(e) => {
            println!("Error occurred: {}", e);
        }
    }
}
```
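If you prefer not to drive the Action/Finish loop yourself, the run_agent helper (used again in the hybrid example below) wraps the call and returns the final answer as a string. A minimal sketch, assuming the same agent as above is in scope:

```rust
use rust_agent::run_agent;

// Drive the agent to completion and print the final answer.
match run_agent(&agent, "What's the weather like in Beijing?".to_string()).await {
    Ok(answer) => println!("AI Response: {}", answer),
    Err(e) => println!("Error occurred: {}", e),
}
```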
The framework now includes a built-in MCP server implementation:
```rust
use rust_agent::{SimpleMcpServer, McpServer, ExampleTool};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create MCP server instance
    let server = SimpleMcpServer::new().with_address("127.0.0.1:6000".to_string());

    // Create example tools
    let weather_tool = ExampleTool::new(
        "get_weather".to_string(),
        "Get weather information for a specified city".to_string()
    );
    let calculator_tool = ExampleTool::new(
        "calculate".to_string(),
        "Perform simple mathematical calculations".to_string()
    );

    // Register tools with server
    server.register_tool(Box::new(weather_tool))?;
    server.register_tool(Box::new(calculator_tool))?;

    // Start server
    server.start("127.0.0.1:6000").await?;
    println!("MCP server started at 127.0.0.1:6000");
    println!("Registered tools: get_weather, calculate");

    // Simulate server running for some time
    tokio::time::sleep(tokio::time::Duration::from_secs(10)).await;

    // Stop server
    server.stop().await?;
    println!("MCP server stopped");

    Ok(())
}
```
The framework supports hybrid mode, allowing simultaneous use of local tools and remote MCP server tools:
```rust
use rust_agent::{run_agent, OpenAIChatModel, McpClient, SimpleMcpClient, McpTool, McpAgent, CompositeMemory};
use std::sync::Arc;
use std::collections::HashMap;
use serde_json::{Value, json};

#[tokio::main]
async fn main() {
    // Read OpenAI-compatible API credentials from the environment
    let api_key = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY is not set");
    // Fall back to the public OpenAI endpoint if OPENAI_API_URL is not set
    let base_url = std::env::var("OPENAI_API_URL")
        .unwrap_or_else(|_| "https://api.openai.com/v1".to_string());

    // Create OpenAI model instance
    let model = OpenAIChatModel::new(api_key, base_url)
        .with_model("gpt-3.5-turbo")
        .with_temperature(0.7);

    // Initialize MCP client
    let mut mcp_client = SimpleMcpClient::new("http://127.0.0.1:6000".to_string());

    // Add local tools
    mcp_client.add_tools(vec![
        McpTool {
            name: "get_local_time".to_string(),
            description: "Get current local time and date".to_string(),
        },
    ]);

    // Register local tool handler
    mcp_client.register_tool_handler("get_local_time".to_string(), |_params: HashMap<String, Value>| async move {
        let now = chrono::Local::now();
        Ok(json!({
            "current_time": now.format("%Y-%m-%d %H:%M:%S").to_string(),
            "timezone": "Local"
        }))
    });

    // Connect to MCP server
    if let Ok(_) = mcp_client.connect("http://127.0.0.1:6000").await {
        mcp_client.set_server_connected(true);
    }

    // Create memory module
    let memory = CompositeMemory::with_basic_params("data".into(), 200, 10).await.unwrap();

    // Create Agent instance
    let client_arc: Arc<dyn McpClient> = Arc::new(mcp_client);
    let mut agent = McpAgent::with_openai_model_and_memory(
        client_arc.clone(),
        "You are an AI assistant that can use both local tools and remote MCP server tools.".to_string(),
        model,
        Box::new(memory)
    );

    // Automatically get tools from MCP client and add to Agent
    if let Err(e) = agent.auto_add_tools().await {
        eprintln!("Warning: Failed to automatically add tools to McpAgent: {}", e);
    }

    // Use Agent to process user input
    match run_agent(&agent, "What time is it now?".to_string()).await {
        Ok(response) => println!("Assistant: {}", response),
        Err(e) => println!("Error: {}", e),
    }
}
```
To create custom tools for MCP servers, you need to implement the Tool trait:
```rust
use rust_agent::Tool;
use anyhow::Error;
use std::pin::Pin;

pub struct CustomTool {
    name: String,
    description: String,
}

impl CustomTool {
    pub fn new(name: String, description: String) -> Self {
        Self { name, description }
    }
}

impl Tool for CustomTool {
    fn name(&self) -> &str {
        &self.name
    }

    fn description(&self) -> &str {
        &self.description
    }

    fn invoke(&self, input: &str) -> Pin<Box<dyn std::future::Future<Output = Result<String, Error>> + Send + '_>> {
        let input_str = input.to_string();
        let name = self.name.clone();
        Box::pin(async move {
            // Your custom tool logic
            Ok(format!("Custom tool {} processed: {}", name, input_str))
        })
    }

    fn as_any(&self) -> &dyn std::any::Any {
        self
    }
}
```
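A custom tool defined this way can be registered with the built-in SimpleMcpServer exactly like the ExampleTool instances in the server example above. A short sketch (the `echo` tool name is just an illustration, and it assumes the CustomTool type from the previous block is in scope):

```rust
use rust_agent::{SimpleMcpServer, McpServer};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create the server as in the MCP server example above
    let server = SimpleMcpServer::new().with_address("127.0.0.1:6000".to_string());

    // Register the custom tool alongside any others (hypothetical `echo` tool)
    let echo_tool = CustomTool::new(
        "echo".to_string(),
        "Echo the input back to the caller".to_string(),
    );
    server.register_tool(Box::new(echo_tool))?;

    server.start("127.0.0.1:6000").await?;
    println!("MCP server running with the custom echo tool registered");

    // Keep serving for a while (as in the server example above), then shut down
    tokio::time::sleep(tokio::time::Duration::from_secs(10)).await;
    server.stop().await?;
    Ok(())
}
```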
The project provides several examples demonstrating how to use the framework to build different types of AI agents. Examples are located in the examples/ directory.
- agent_example.rs: Basic agent usage example
- mcp_agent_client_chatbot.rs: MCP client chatbot example (server-side tools only)
- mcp_agent_hybrid_chatbot.rs: Hybrid mode MCP agent example (local get_local_time tool + server-side tools)
- mcp_agent_local_chatbot.rs: Local MCP agent chatbot example (local tools only)
- mcp_server_complete_example.rs: Complete MCP server example with real tool implementations (providing get_weather and simple_calculate tools)

agent_example.rs: Shows how to create a simple agent with custom tools:
```bash
# Run example
cargo run --example agent_example
```
mcp_agent_client_chatbot.rs: Demonstrates how to use McpAgent to build a simple chatbot that connects to an MCP server and uses only server-side tools. This example shows a pure client implementation, completely relying on remote tools:

- Server-side tools (get_weather, simple_calculate)

```bash
# Run example
cargo run --example mcp_agent_client_chatbot
```
mcp_agent_hybrid_chatbot.rs: Demonstrates how to use McpAgent in hybrid mode, combining local tools (like get_local_time) with server-side tools. This example shows how an agent can use both local and remote tools based on task requirements:

- Local tool: get_local_time - Get current local time and date
- Server-side tools (get_weather, simple_calculate)

```bash
# Run example
cargo run --example mcp_agent_hybrid_chatbot
```
mcp_agent_local_chatbot.rs: Demonstrates how to use McpAgent with only local tools. This example shows how an agent can run without connecting to any remote MCP server, using only locally implemented tools:

- get_weather - Get weather information for a specified city
- simple_calculate - Perform simple mathematical calculations

```bash
# Run example
cargo run --example mcp_agent_local_chatbot
```
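For reference, local-only mode follows the same pattern as the hybrid example above, just without the connect step. A condensed sketch (the simple_calculate handler body here is a stand-in, not the example's real implementation):

```rust
use rust_agent::{run_agent, McpAgent, McpClient, McpTool, SimpleMcpClient};
use serde_json::{json, Value};
use std::collections::HashMap;
use std::sync::Arc;

#[tokio::main]
async fn main() {
    // Local tools only: never connect to a remote MCP server
    let mut mcp_client = SimpleMcpClient::new("http://127.0.0.1:6000".to_string());
    mcp_client.add_tools(vec![McpTool {
        name: "simple_calculate".to_string(),
        description: "Perform simple mathematical calculations".to_string(),
    }]);
    mcp_client.register_tool_handler(
        "simple_calculate".to_string(),
        |_params: HashMap<String, Value>| async move {
            // Stand-in implementation
            Ok(json!({ "result": 42 }))
        },
    );

    // Create the agent; as in the quick-start example, the chat model is assumed
    // to be configured via the environment variables listed below
    let client_arc: Arc<dyn McpClient> = Arc::new(mcp_client);
    let mut agent = McpAgent::new(
        client_arc,
        "You are an AI assistant that uses only local tools.".to_string(),
    );
    if let Err(e) = agent.auto_add_tools().await {
        eprintln!("Warning: Failed to automatically add tools to McpAgent: {}", e);
    }

    match run_agent(&agent, "What is 6 times 7?".to_string()).await {
        Ok(response) => println!("Assistant: {}", response),
        Err(e) => println!("Error: {}", e),
    }
}
```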
mcp_server_complete_example.rs: A more complete example showing how to implement custom tools with actual functionality, such as get_weather and simple_calculate:

```bash
# Run example
cargo run --example mcp_server_complete_example
```
The project includes unit tests to verify the framework's functionality:
```bash
# Run all tests
cargo test
```
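As a starting point for tests of your own tools, the CustomTool from the earlier section can be exercised directly in an async test. A minimal sketch, assuming tokio is available as the test runtime and CustomTool lives in the parent module:

```rust
#[cfg(test)]
mod tests {
    use super::CustomTool;
    use rust_agent::Tool;

    #[tokio::test]
    async fn custom_tool_formats_its_output() {
        let tool = CustomTool::new("demo".to_string(), "Demo tool".to_string());
        assert_eq!(tool.name(), "demo");

        // invoke() returns a boxed future; await it and check the formatted output
        let output = tool.invoke("ping").await.unwrap();
        assert_eq!(output, "Custom tool demo processed: ping");
    }
}
```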
To build the project, simply run:
```bash
# Build project
cargo build
```
The project builds successfully, with only a few warnings about unused struct fields that do not affect functionality.
When using the framework, you may need to configure the following environment variables:
- OPENAI_API_KEY: OpenAI-compatible API key
- OPENAI_API_URL: OpenAI-compatible API base URL (optional, defaults to the official OpenAI API)
- OPENAI_API_MODEL: OpenAI-compatible API model name (optional, defaults to gpt-3.5-turbo)
- MCP_URL: MCP server URL (optional, defaults to http://127.0.0.1:6000)

To add new tools, implement the Tool interface or use McpToolAdapter.

If you'd like to contribute to the project, please follow these steps: