Crates.io | mcp-commune |
version | 0.6.2 |
created_at | 2025-01-24 16:48:51.396481+00 |
updated_at | 2025-08-13 22:19:28.273429+00 |
description | Rust client and server for building discoverable Model Context Protocol (MCP) networks |
repository | https://github.com/jgmartin/commune |
size | 130,845 |
Commune is a Rust library designed to support the development of discoverable networks of AI agents. It serves as a wrapper over mcp-sdk-rs, providing enhanced functionality for peer discovery and resource utilization within Model Context Protocol (MCP) networks.
Add the following to your `Cargo.toml`:
```toml
[dependencies]
commune = { package = "mcp-commune", version = "0.6.2" }
```
```rust
use commune::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create a peer for each MCP server
    let peer1 = PeerBuilder::new()
        .with_name("everything".to_string())
        .with_url("ws://localhost:8780".to_string())
        .with_description("various example resources".to_string())
        .build()
        .await?;

    let peer2 = PeerBuilder::new()
        .with_name("memory".to_string())
        .with_url("ws://localhost:8781".to_string())
        .with_description("memory based on a knowledge graph".to_string())
        .build()
        .await?;

    // Optionally create a commune client to aggregate resources across peers
    let commune_client = ClientBuilder::new()
        .with_peers(vec![peer1.clone(), peer2])
        .build()
        .await?;

    // Get all tools
    let peer_tools = commune_client.all_tools().await?;
    log::info!("found {} tools", peer_tools.len());

    // Get all resources
    let peer_resources = commune_client.all_resources().await?;
    log::info!("found {} resources!", peer_resources.len());

    // Get all prompts
    let peer_prompts = commune_client.all_prompts().await?;
    log::info!("found {} prompts!", peer_prompts.len());

    // Use peers directly to call tools, subscribe to notifications, etc.
    // Subscribe to resource updates
    peer1.subscribe("test://static/resource/2").await?;

    Ok(())
}
```
Commune provides convenient type conversion implementations for various inference APIs.
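In Rust, such conversions are typically expressed through the `From`/`Into` traits, so an MCP tool definition can be passed directly where an inference API's function schema is expected. A minimal sketch of that pattern, using stand-in types (`McpTool` and `InferenceTool` are hypothetical here, not the crate's actual type names):

```rust
// Hypothetical stand-in for an MCP tool definition.
struct McpTool {
    name: String,
    description: String,
}

// Hypothetical stand-in for an inference API's function/tool schema.
struct InferenceTool {
    function_name: String,
    function_description: String,
}

// Implementing `From` gives callers `.into()` for free at the call site.
impl From<McpTool> for InferenceTool {
    fn from(tool: McpTool) -> Self {
        InferenceTool {
            function_name: tool.name,
            function_description: tool.description,
        }
    }
}

fn main() {
    let mcp_tool = McpTool {
        name: "echo".to_string(),
        description: "echoes its input".to_string(),
    };
    // Convert an MCP tool into the inference API's representation.
    let inference_tool: InferenceTool = mcp_tool.into();
    println!(
        "{}: {}",
        inference_tool.function_name, inference_tool.function_description
    );
}
```

The same idiom scales to lists: a `Vec<McpTool>` aggregated from peers can be mapped with `.into_iter().map(Into::into).collect()` to produce the payload an inference API expects.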
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License.