Crates.io | prism-mcp-rs |
lib.rs | prism-mcp-rs |
version | 1.1.1 |
created_at | 2025-08-10 01:28:10.80228+00 |
updated_at | 2025-09-13 21:18:54.560205+00 |
description | Production-grade Rust SDK for Model Context Protocol (MCP) - Build AI agents, LLM integrations, and assistant tools with enterprise features |
homepage | https://github.com/prismworks-ai/prism-mcp-rs |
repository | https://github.com/prismworks-ai/prism-mcp-rs |
max_upload_size | |
id | 1788362 |
size | 1,726,711 |
prism-mcp-rs is a production-grade Rust implementation of the Model Context Protocol (MCP) SDK with enterprise-class features for building secure, scalable MCP servers and clients.
The first MCP SDK designed for production AI systems. While other implementations focus on basic protocol compliance, Prism MCP brings enterprise-grade reliability patterns, zero-downtime operations, and plugin ecosystems that scale.
Built for the AI-first world: Where services need to be fault-tolerant, discoverable, and composable. Where hot-swapping capabilities matters more than cold starts. Where observability isn't optionalβit's survival.
From prototype to production in minutes: Clean APIs that hide complexity, but expose power when you need it.
Component | Description | Key Features |
---|---|---|
Transport Layer | Multi-protocol transport abstraction | STDIO, HTTP/1.1, HTTP/2, WebSocket, SSE |
Protocol Engine | MCP 2025-06-18 implementation | JSON-RPC, batch operations, streaming |
Plugin Runtime | Dynamic extension system | Hot reload, sandboxing, versioning |
Resilience Core | Fault tolerance mechanisms | Circuit breakers, retries, health checks |
Security Module | Authentication and authorization | JWT, OAuth2, mTLS, rate limiting |
[dependencies]
prism-mcp-rs = "0.1.0"
tokio = { version = "1", features = ["full"] }
serde_json = "1.0"
async-trait = "0.1"
Feature Category | Features | Use Case |
---|---|---|
Core Transports | stdio , http , websocket |
Basic connectivity |
HTTP Extensions | sse , http2 , chunked-encoding , compression |
Advanced HTTP capabilities |
Security | auth , tls |
Authentication and encryption |
Extensions | plugin |
Runtime extensibility |
Bundles | full , minimal |
Convenience feature sets |
# High-performance configuration
[dependencies]
prism-mcp-rs = {
version = "0.1.0",
features = ["http2", "compression", "plugin", "auth", "tls"]
}
# Memory-constrained environments
[dependencies]
prism-mcp-rs = {
version = "0.1.0",
default-features = false,
features = ["stdio"]
}
use prism_mcp_rs::prelude::*;
use std::collections::HashMap;
#[derive(Clone)]
struct SystemToolHandler;
#[async_trait]
impl ToolHandler for SystemToolHandler {
async fn call(&self, arguments: HashMap<String, Value>) -> McpResult<ToolResult> {
match arguments.get("tool_name").and_then(|v| v.as_str()) {
Some("system_info") => {
let info = format!(
"Host: {}, OS: {}",
hostname::get().unwrap_or_default().to_string_lossy(),
std::env::consts::OS
);
Ok(ToolResult {
content: vec![ContentBlock::text(&info)],
is_error: Some(false),
meta: None,
structured_content: None,
})
},
_ => Err(McpError::invalid_request("Unknown tool"))
}
}
}
#[tokio::main]
async fn main() -> McpResult<()> {
let mut server = McpServer::new(
"system-server".to_string(),
"1.0.0".to_string()
);
// Add the system_info tool using the async add_tool method
server.add_tool(
"system_info",
Some("Get system information"),
json!({"type": "object", "properties": {}}),
SystemToolHandler,
).await?;
// Start server with STDIO transport
let transport = StdioServerTransport::new();
server.start(transport).await
}
use prism_mcp_rs::prelude::*;
use prism_mcp_rs::transport::StdioClientTransport;
let transport = StdioClientTransport::new("./server");
let client = McpClient::new(
"test-client".to_string(),
"1.0.0".to_string()
);
client.initialize().await?;
client.set_transport(Box::new(transport)).await?;
// Make requests to the server
let tools_response = client.list_tools(None, None).await?;
let tool_result = client.call_tool(
"system_info".to_string(),
json!({})
).await?;
println!("Available tools: {:?}", tools_response);
println!("Tool result: {:?}", tool_result);
use prism_mcp_rs::prelude::*;
use prism_mcp_rs::plugin::*;
use std::any::Any;
struct WeatherPlugin {
api_key: String,
}
#[async_trait]
impl ToolPlugin for WeatherPlugin {
fn metadata(&self) -> PluginMetadata {
PluginMetadata {
id: "weather-plugin".to_string(),
name: "Weather Plugin".to_string(),
version: "1.0.0".to_string(),
author: Some("Example Author".to_string()),
description: Some("Provides weather information".to_string()),
homepage: None,
license: Some("MIT".to_string()),
mcp_version: "1.1.0".to_string(),
capabilities: PluginCapabilities::default(),
dependencies: vec![],
}
}
fn tool_definition(&self) -> Tool {
Tool::new(
"get_weather".to_string(),
Some("Get weather information for a location".to_string()),
json!({
"type": "object",
"properties": {
"location": {"type": "string", "description": "Location to get weather for"}
},
"required": ["location"]
}),
EchoTool // Placeholder - would use actual weather handler
)
}
async fn execute(&self, arguments: Value) -> McpResult<ToolResult> {
let location = arguments["location"].as_str().unwrap_or("Unknown");
let weather_data = json!({
"location": location,
"temperature": "22Β°C",
"condition": "Sunny"
});
Ok(ToolResult {
content: vec![ContentBlock::text(&format!("Weather in {}: {}", location, weather_data))],
is_error: Some(false),
meta: None,
structured_content: Some(weather_data),
})
}
fn as_any(&self) -> &dyn Any {
self
}
}
// Runtime plugin management
let mut plugin_manager = PluginManager::new();
plugin_manager.reload_plugin("weather_plugin").await?; // Hot reload support
Automatic capability negotiation and runtime schema introspection eliminates manual configuration:
// Client connects and discovers server capabilities
let transport = StdioClientTransport::new("./server");
let client = McpClient::new("discovery-client".to_string(), "1.0.0".to_string());
client.initialize().await?;
client.set_transport(Box::new(transport)).await?;
// Discover server capabilities through initialization
let server_info = client.get_server_info().await?;
println!("Server capabilities: {:?}", server_info.capabilities);
Built-in resilience patterns prevent cascading failures in distributed AI systems:
// Basic tool call with proper error handling
let result = match client.call_tool("analyze".to_string(), data).await {
Ok(result) => result,
Err(e) => {
eprintln!("Tool call failed: {}", e);
// Implement your own fallback logic here
return Err(e);
}
};
Hot-swappable plugins with ABI stability across Rust versions:
// Plugin lifecycle management
plugin_manager.unload_plugin("analyzer_v1").await?;
plugin_manager.load_plugin("analyzer_v2.so").await?;
// Plugin health monitoring
let health = plugin_manager.check_plugin_health("analyzer_v2").await?;
Combine multiple AI services with automatic failover and load balancing.
Connect legacy systems to modern AI tools with protocol translation and security policies.
Build streaming data processing pipelines with sub-millisecond latency guarantees.
Create distributed AI service meshes with automatic service discovery and routing.
Deploy AI capabilities to edge devices with offline-first architecture and smart sync.
Metric | Value | Impact |
---|---|---|
Zero-downtime deployments | < 100ms | Keep AI services running during updates |
Automatic failover | < 50ms | No user-visible service interruptions |
Memory efficiency | 2-12MB baseline | Deploy to edge and resource-constrained environments |
Protocol overhead | < 0.5ms | Sub-millisecond response times for real-time AI |
# Run security audit
./scripts/security-audit.sh
# Update dependencies safely
./scripts/update-dependencies.sh
# Check supply chain status
cargo vet check
cargo deny check all
cargo audit
Contributions are welcome! Please review our Contributing Guidelines and Code of Conduct.
See our Contributors for a list of everyone who has contributed to this project.
MIT License - see LICENSE for details.