| Field | Value |
|---|---|
| Crates.io | mcp-rs-sdk |
| lib.rs | mcp-rs-sdk |
| version | 0.2.0 |
| created_at | 2025-07-01 19:38:13.370325+00 |
| updated_at | 2025-07-03 11:09:48.042134+00 |
| description | A Rust SDK for building agents for the Model Context Protocol (MCP). |
| homepage | https://github.com/anggasct/mcp-rs-sdk |
| repository | https://github.com/anggasct/mcp-rs-sdk |
| max_upload_size | |
| id | 1733613 |
| size | 41,077 |
A lightweight Rust SDK with minimal dependencies for building agents compatible with the Model Context Protocol (MCP).

This SDK provides the necessary data structures and a runtime loop for communicating with an MCP host (such as Claude Desktop or other compatible clients). It lets you focus on your agent's logic without worrying about the details of the protocol's JSON-based standard I/O communication.
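The runtime loop the SDK provides can be pictured as a read-handle-write cycle over standard I/O. Below is a minimal std-only sketch of that idea; the `content` field name and line-delimited framing are illustrative assumptions, not the SDK's actual wire format, which the library handles for you.

```rust
use std::io::{self, BufRead, Write};

// Hand-rolled framing helper: wraps plain text in a one-line JSON object.
// NOTE: the "content" field name is an assumption for illustration only;
// the real SDK serializes its own Response types with serde.
fn frame_response(content: &str) -> String {
    format!(
        "{{\"content\":\"{}\"}}",
        content.replace('\\', "\\\\").replace('"', "\\\"")
    )
}

fn main() {
    let stdin = io::stdin();
    let stdout = io::stdout();
    let mut out = stdout.lock();
    // One request per stdin line in, one JSON response per stdout line out.
    for line in stdin.lock().lines() {
        let _request = line.expect("failed to read from stdin");
        let reply = frame_response("Hello from a hand-rolled loop!");
        writeln!(out, "{}", reply).expect("failed to write to stdout");
    }
}
```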
- Built on the `serde` crate for robust JSON handling.
- Optional `streaming` feature flag (adds `futures` and `tokio`).
- A dedicated `McpError` type ensures the library never panics.

This SDK aims to be a robust, idiomatic Rust implementation of the Model Context Protocol. The architecture and feature set are heavily inspired by the official TypeScript SDK.
The main goal is to provide functional parity with the TypeScript version, but with a developer experience that feels natural in the Rust ecosystem. This includes offering both simple non-streaming and powerful async streaming agent runners, along with a rich set of helper functions. Credit for the clear and effective API design goes to the authors of the original TypeScript SDK.
Add `mcp-rs-sdk` to your `Cargo.toml`. For streaming support, enable the `streaming` feature.

```toml
[dependencies]
mcp-rs-sdk = { version = "0.2.0", features = ["streaming"] } # Add "streaming" for streaming support
anyhow = "1.0" # For easy error handling in the examples
tokio = { version = "1", features = ["macros", "rt-multi-thread"] } # For the streaming example
futures = "0.3" # For the streaming example
```
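If you only need the simple request-response runner, you can skip the `streaming` feature and its `tokio`/`futures` dependencies. A minimal dependency section for that case would look like:

```toml
[dependencies]
mcp-rs-sdk = "0.2.0" # default features: non-streaming runner only
anyhow = "1.0"       # error handling in the handler
```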
```rust
use mcp_rs_sdk::{run_agent, GetContextRequest, Response, create_content_response, error::McpError};
use anyhow::Result;

fn my_handler(_req: GetContextRequest) -> Result<Response> {
    // Your agent logic here. Use helpers to build a response.
    Ok(create_content_response("Hello from a non-streaming agent!"))
}

fn main() -> Result<(), McpError> {
    run_agent(my_handler)
}
```
This example shows an agent that defines a tool, receives a request, and executes a tool call.
```rust
use mcp_rs_sdk::{
    run_streaming_agent, GetContextRequest, PartialResponse,
    create_streaming_content_chunk, error::McpError, Tool, ToolInputSchema,
    helpers::parse_function_args,
};
use anyhow::Result;
use futures::stream::{self, Stream};
use std::convert::Infallible;
use std::collections::HashMap;
use serde::Deserialize;
use serde_json::json;

#[derive(Deserialize)]
struct AddArgs {
    a: f64,
    b: f64,
}

// 1. Define the tools your agent supports
fn get_tools() -> Vec<Tool> {
    vec![Tool {
        r#type: "function".to_string(),
        function: mcp_rs_sdk::FunctionDefinition {
            name: "add".to_string(),
            description: Some("Adds two numbers.".to_string()),
            parameters: Some(ToolInputSchema {
                r#type: "object".to_string(),
                properties: Some({
                    let mut props = HashMap::new();
                    props.insert("a".to_string(), json!({"type": "number"}));
                    props.insert("b".to_string(), json!({"type": "number"}));
                    props
                }),
                required: Some(vec!["a".to_string(), "b".to_string()]),
            }),
        },
    }]
}

// 2. Implement the handler
fn my_streaming_handler(
    req: GetContextRequest,
) -> impl Stream<Item = Result<PartialResponse, Infallible>> {
    // If the last message carries a tool call, execute it.
    // (Avoid unwrap/indexing so an empty message list cannot panic.)
    if let Some(last_message) = req.messages.last() {
        if let Some(tool_call) = last_message.tool_calls.as_ref().and_then(|c| c.first()) {
            if tool_call.function.name == "add" {
                let args: Result<AddArgs, _> = parse_function_args(tool_call);
                let result = args.map(|a| a.a + a.b);
                let response = mcp_rs_sdk::helpers::create_streaming_tool_result_chunk(
                    tool_call.id.clone(),
                    result.map_or_else(|e| e.to_string(), |r| r.to_string()),
                );
                return stream::once(async { Ok(response) });
            }
        }
    }
    // Otherwise, respond with a plain content chunk.
    stream::once(async { Ok(create_streaming_content_chunk("Hello, world!")) })
}

#[tokio::main]
async fn main() -> Result<(), McpError> {
    let _tools = get_tools();
    // In a real app, you would pass the tools to an LLM to inform its decisions.
    run_streaming_agent(my_streaming_handler).await
}
```
**Core functions**

- `run_agent(handler)`: For simple, request-response agents.
- `run_streaming_agent(handler)`: For real-time, streaming agents (requires the `streaming` feature).

**Types**

- `GetContextRequest`: The incoming request from the MCP host.
- `Response`: The standard response object created by a handler.
- `GetContextResponse`: The final, non-streaming wrapper sent to the host.
- `PartialResponse`: A chunk of a streaming response.
- `Message`, `Role`, `Tool`, `ToolCall`, `ToolResult`, `ToolInputSchema`: All necessary structs for working with the protocol.
- `McpError`: The custom error type for all fallible operations.

**Helpers**

- `create_content_response(content)`: Create a simple text response.
- `create_tool_calls_response(tool_calls)`: Create a response that requests a tool call.
- `create_function_response(id, result)`: Create a tool result response.
- `create_error_response(message)`: Create an error response.
- `parse_function_args(tool_call)`: Safely deserialize a tool call's JSON arguments into a struct.
- `create_streaming_*`: Streaming versions of the response helpers.