enki-runtime

Crate: enki-runtime (lib.rs)
Version: 0.1.4
Created: 2026-01-01
Updated: 2026-01-03
Description: A Rust-based agent mesh framework for building local and distributed AI agent systems
Homepage: https://github.com/enkiai/enki
Repository: https://github.com/enkiai/enki
Size: 284,273 bytes
Owner: dewmal
Documentation: https://docs.rs/enki-runtime

README

Enki Runtime

A Rust-based agent mesh framework for building local and distributed AI agent systems.


Features

  • Agent Framework: Build autonomous AI agents with a simple trait-based API
  • Local Mesh: Connect multiple agents in a local mesh for inter-agent communication
  • LLM Integration: Built-in support for 13+ LLM providers (OpenAI, Anthropic, Ollama, Google, etc.)
  • Memory Backends: Pluggable memory systems with in-memory, SQLite, and Redis support
  • MCP Support: Model Context Protocol for tool integration
  • Async-first: Built on Tokio for high-performance async operations
  • Observability: Structured logging and metrics collection built-in

Architecture

Enki Runtime is modular, split into focused sub-crates:

Crate                Description
enki-core            Core abstractions: Agent, Memory, Mesh, Message
enki-llm             LLM integration with multi-provider support
enki-local           Local mesh implementation
enki-memory          Memory backend implementations
enki-observability   Logging and metrics
enki-mcp             Model Context Protocol support (optional)

The enki-runtime umbrella crate re-exports all components for convenience.
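Since the sub-crates above are listed as separate crates, a project that only needs part of the stack may be able to depend on them individually rather than pulling in the umbrella crate. This is a sketch under that assumption — the umbrella crate is the documented entry point, and the individual sub-crate versions should be verified on crates.io before relying on them:

```toml
# Documented path: pull in everything via the umbrella crate.
[dependencies]
enki-runtime = "0.1"

# Assumed alternative: depend only on the sub-crates you use.
# (Sub-crate availability/versions are an assumption; check crates.io.)
# [dependencies]
# enki-core = "0.1"
# enki-local = "0.1"
```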

Installation

[dependencies]
enki-runtime = "0.1"

Feature Flags

  • sqlite - Enable SQLite memory backend
  • redis - Enable Redis memory backend
  • mcp - Enable Model Context Protocol support
  • full - Enable all optional features

[dependencies]
enki-runtime = { version = "0.1", features = ["sqlite", "redis", "mcp"] }

Quick Start

use enki_runtime::{Agent, AgentContext, LocalMesh, Message};
use enki_runtime::core::error::Result;
use enki_runtime::core::mesh::Mesh;
use async_trait::async_trait;

struct MyAgent {
    name: String,
}

#[async_trait]
impl Agent for MyAgent {
    fn name(&self) -> String {
        self.name.clone()
    }

    async fn on_message(&mut self, msg: Message, _ctx: &mut AgentContext) -> Result<()> {
        println!("Received: {:?}", msg.topic);
        Ok(())
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let agent = MyAgent { name: "my-agent".to_string() };
    let mesh = LocalMesh::new("my-mesh");
    mesh.add_agent(Box::new(agent)).await?;
    mesh.start().await?;
    Ok(())
}

Using LLM Agents

use enki_runtime::LlmAgent;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create an LLM agent with Ollama (no API key needed)
    let mut agent = LlmAgent::builder("assistant", "ollama::gemma3:latest")
        .with_system_prompt("You are a helpful assistant.")
        .with_temperature(0.7)
        .build()?;

    // Use directly (without mesh)
    let mut ctx = enki_runtime::AgentContext::new("test".to_string(), None);
    let response = agent.send_message_and_get_response("Hello!", &mut ctx).await?;
    println!("Response: {}", response);

    Ok(())
}

TOML Configuration

Load agents from TOML files:

use enki_runtime::{LlmAgent, LlmAgentFromConfig};
use enki_runtime::config::AgentConfig;

let config = AgentConfig::from_file("agent.toml")?;
let agent = LlmAgent::from_config(config)?;
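The README does not show the agent.toml schema. A plausible minimal file, mirroring the LlmAgent builder options from the Quick Start (name, model, system prompt, temperature), might look like the following — the field names are assumptions, so check the crate's config documentation for the confirmed schema:

```toml
# Hypothetical agent.toml; field names mirror the LlmAgent builder
# options shown above and are assumptions, not the confirmed schema.
name = "assistant"
model = "ollama::gemma3:latest"
system_prompt = "You are a helpful assistant."
temperature = 0.7
```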

Core Components

Component         Description
Agent             Trait for defining agent behavior
LocalMesh         Local multi-agent coordination
LlmAgent          Pre-built agent with LLM capabilities
Memory            Trait for memory backends
InMemoryBackend   In-memory storage (default)
SqliteBackend     SQLite persistent storage (feature: sqlite)
RedisBackend      Redis distributed storage (feature: redis)
McpClient         MCP client for external tools (feature: mcp)

Examples

Run examples with:

cargo run --example llm_ollama
cargo run --example toml_agents
cargo run --example mesh_architecture
cargo run --example mcp_client --features mcp

Documentation

Full API documentation is available at https://docs.rs/enki-runtime.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
