| Crates.io | ceylon-runtime |
| lib.rs | ceylon-runtime |
| version | 0.1.3 |
| created_at | 2025-12-20 04:22:12.718428+00 |
| updated_at | 2025-12-30 13:26:23.5698+00 |
| description | A Rust-based agent mesh framework for building local and distributed AI agent systems |
| homepage | https://github.com/ceylonai/ceylon |
| repository | https://github.com/ceylonai/ceylon |
| max_upload_size | |
| id | 1996001 |
| size | 272,300 |
A Rust-based agent mesh framework for building local and distributed AI agent systems.
Ceylon Runtime is modular, split into focused sub-crates:
| Crate | Description |
|---|---|
| ceylon-core | Core abstractions: Agent, Memory, Mesh, Message |
| ceylon-llm | LLM integration with multi-provider support |
| ceylon-local | Local mesh implementation |
| ceylon-memory | Memory backend implementations |
| ceylon-observability | Logging and metrics |
| ceylon-mcp | Model Context Protocol (optional) |
The ceylon-runtime umbrella crate re-exports all components for convenience.
Add the crate to your `Cargo.toml`:

```toml
[dependencies]
ceylon-runtime = "0.1"
```

Optional features:

- sqlite - Enable SQLite memory backend
- redis - Enable Redis memory backend
- mcp - Enable Model Context Protocol support
- full - Enable all optional features

```toml
[dependencies]
ceylon-runtime = { version = "0.1", features = ["sqlite", "redis", "mcp"] }
```
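Since the `full` feature enables all optional features, the per-feature list above can also be replaced with a single flag:

```toml
[dependencies]
ceylon-runtime = { version = "0.1", features = ["full"] }
```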
Define an agent and run it on a local mesh:

```rust
use ceylon_runtime::{Agent, AgentContext, LocalMesh, Message};
use ceylon_runtime::core::error::Result;
use ceylon_runtime::core::mesh::Mesh;
use async_trait::async_trait;

struct MyAgent {
    name: String,
}

#[async_trait]
impl Agent for MyAgent {
    fn name(&self) -> String {
        self.name.clone()
    }

    async fn on_message(&mut self, msg: Message, _ctx: &mut AgentContext) -> Result<()> {
        println!("Received: {:?}", msg.topic);
        Ok(())
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let agent = MyAgent { name: "my-agent".to_string() };
    let mesh = LocalMesh::new("my-mesh");
    mesh.add_agent(Box::new(agent)).await?;
    mesh.start().await?;
    Ok(())
}
```
Create an LLM-backed agent:

```rust
use ceylon_runtime::LlmAgent;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create an LLM agent with Ollama (no API key needed)
    let mut agent = LlmAgent::builder("assistant", "ollama::gemma3:latest")
        .with_system_prompt("You are a helpful assistant.")
        .with_temperature(0.7)
        .build()?;

    // Use directly (without mesh)
    let mut ctx = ceylon_runtime::AgentContext::new("test".to_string(), None);
    let response = agent.send_message_and_get_response("Hello!", &mut ctx).await?;
    println!("Response: {}", response);
    Ok(())
}
```
Load agents from TOML files:
```rust
use ceylon_runtime::{LlmAgent, LlmAgentFromConfig};
use ceylon_runtime::config::AgentConfig;

let config = AgentConfig::from_file("agent.toml")?;
let agent = LlmAgent::from_config(config)?;
```
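The TOML schema itself is not documented here; a minimal `agent.toml` might look like the sketch below. The field names (`name`, `model`, `system_prompt`, `temperature`) are assumptions mirroring the builder options shown above, not the crate's confirmed schema:

```toml
# Hypothetical agent.toml -- field names are illustrative,
# mirroring the LlmAgent builder options above.
name = "assistant"
model = "ollama::gemma3:latest"
system_prompt = "You are a helpful assistant."
temperature = 0.7
```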
| Component | Description |
|---|---|
| Agent | Trait for defining agent behavior |
| LocalMesh | Local multi-agent coordination |
| LlmAgent | Pre-built agent with LLM capabilities |
| Memory | Trait for memory backends |
| InMemoryBackend | In-memory storage (default) |
| SqliteBackend | SQLite persistent storage (feature: sqlite) |
| RedisBackend | Redis distributed storage (feature: redis) |
| McpClient | MCP client for external tools (feature: mcp) |
Run examples with:
```sh
cargo run --example llm_ollama
cargo run --example toml_agents
cargo run --example mesh_architecture
cargo run --example mcp_client --features mcp
```
This project is licensed under the MIT License - see the LICENSE file for details.
Contributions are welcome! Please feel free to submit a Pull Request.