| Crates.io | enki-runtime |
| lib.rs | enki-runtime |
| version | 0.1.4 |
| created_at | 2026-01-01 19:11:15.054942+00 |
| updated_at | 2026-01-03 17:33:47.9276+00 |
| description | A Rust-based agent mesh framework for building local and distributed AI agent systems |
| homepage | https://github.com/enkiai/enki |
| repository | https://github.com/enkiai/enki |
| max_upload_size | |
| id | 2017276 |
| size | 284,273 |
A Rust-based agent mesh framework for building local and distributed AI agent systems.
Enki Runtime is modular, split into focused sub-crates:
| Crate | Description |
|---|---|
| `enki-core` | Core abstractions: `Agent`, `Memory`, `Mesh`, `Message` |
| `enki-llm` | LLM integration with multi-provider support |
| `enki-local` | Local mesh implementation |
| `enki-memory` | Memory backend implementations |
| `enki-observability` | Logging and metrics |
| `enki-mcp` | Model Context Protocol (optional) |
The enki-runtime umbrella crate re-exports all components for convenience.
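For example, the core types used throughout this README can be imported straight from the crate root:

```rust
// All of these are re-exports from the sub-crates listed above.
use enki_runtime::{Agent, AgentContext, LlmAgent, LocalMesh, Message};
```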
Add the crate to your `Cargo.toml`:

```toml
[dependencies]
enki-runtime = "0.1"
```
Optional features:

- `sqlite` - Enable SQLite memory backend
- `redis` - Enable Redis memory backend
- `mcp` - Enable Model Context Protocol support
- `full` - Enable all optional features

```toml
[dependencies]
enki-runtime = { version = "0.1", features = ["sqlite", "redis", "mcp"] }
```
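To turn everything on at once, use the `full` feature instead of listing each one:

```toml
[dependencies]
enki-runtime = { version = "0.1", features = ["full"] }
```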
Define an agent by implementing the `Agent` trait, then register it on a `LocalMesh` and start the mesh:

```rust
use enki_runtime::{Agent, AgentContext, LocalMesh, Message};
use enki_runtime::core::error::Result;
use enki_runtime::core::mesh::Mesh;
use async_trait::async_trait;

struct MyAgent {
    name: String,
}

#[async_trait]
impl Agent for MyAgent {
    fn name(&self) -> String {
        self.name.clone()
    }

    // Called for each message delivered to this agent.
    async fn on_message(&mut self, msg: Message, _ctx: &mut AgentContext) -> Result<()> {
        println!("Received: {:?}", msg.topic);
        Ok(())
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let agent = MyAgent { name: "my-agent".to_string() };
    let mesh = LocalMesh::new("my-mesh");
    mesh.add_agent(Box::new(agent)).await?;
    mesh.start().await?;
    Ok(())
}
```
Create an LLM-backed agent with the builder API and talk to it directly:

```rust
use enki_runtime::LlmAgent;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create an LLM agent with Ollama (no API key needed)
    let mut agent = LlmAgent::builder("assistant", "ollama::gemma3:latest")
        .with_system_prompt("You are a helpful assistant.")
        .with_temperature(0.7)
        .build()?;

    // Use the agent directly (without a mesh)
    let mut ctx = enki_runtime::AgentContext::new("test".to_string(), None);
    let response = agent.send_message_and_get_response("Hello!", &mut ctx).await?;
    println!("Response: {}", response);
    Ok(())
}
```
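An `LlmAgent` can presumably also join a mesh like any hand-written agent; the "use directly (without mesh)" comment above implies this, though the README never shows it. A minimal sketch, reusing only the calls already shown in the quick-start example:

```rust
use enki_runtime::{LlmAgent, LocalMesh};
use enki_runtime::core::mesh::Mesh;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Same Ollama-backed agent as above, assumed to implement `Agent`.
    let agent = LlmAgent::builder("assistant", "ollama::gemma3:latest")
        .with_system_prompt("You are a helpful assistant.")
        .build()?;

    // Register it on a local mesh and start, exactly as in the quick start.
    let mesh = LocalMesh::new("llm-mesh");
    mesh.add_agent(Box::new(agent)).await?;
    mesh.start().await?;
    Ok(())
}
```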
Load agents from TOML files:
```rust
use enki_runtime::{LlmAgent, LlmAgentFromConfig};
use enki_runtime::config::AgentConfig;

let config = AgentConfig::from_file("agent.toml")?;
let agent = LlmAgent::from_config(config)?;
```
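The schema of `agent.toml` is not documented in this README; below is a plausible sketch, with field names guessed from the builder calls shown earlier (all of them assumptions):

```toml
# Hypothetical agent.toml - field names mirror the builder API above
# (name, model, system prompt, temperature) and are not confirmed.
name = "assistant"
model = "ollama::gemma3:latest"
system_prompt = "You are a helpful assistant."
temperature = 0.7
```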
| Component | Description |
|---|---|
| `Agent` | Trait for defining agent behavior |
| `LocalMesh` | Local multi-agent coordination |
| `LlmAgent` | Pre-built agent with LLM capabilities |
| `Memory` | Trait for memory backends |
| `InMemoryBackend` | In-memory storage (default) |
| `SqliteBackend` | SQLite persistent storage (feature: `sqlite`) |
| `RedisBackend` | Redis distributed storage (feature: `redis`) |
| `McpClient` | MCP client for external tools (feature: `mcp`) |
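As a rough sketch of how the feature-gated backends might be constructed (the README names these types but not their constructors, so `new()` and the SQLite path argument are assumptions):

```rust
// Hypothetical constructors - only the type names are confirmed above.
#[cfg(not(feature = "sqlite"))]
let memory = enki_runtime::InMemoryBackend::new();
#[cfg(feature = "sqlite")]
let memory = enki_runtime::SqliteBackend::new("agents.db")?;
```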
Run examples with:
```sh
cargo run --example llm_ollama
cargo run --example toml_agents
cargo run --example mesh_architecture
cargo run --example mcp_client --features mcp
```
This project is licensed under the MIT License - see the LICENSE file for details.
Contributions are welcome! Please feel free to submit a Pull Request.