Mojentic


A modern LLM integration framework for Rust, with full feature parity with the Python, Elixir, and TypeScript implementations.

Mojentic provides a clean abstraction over multiple LLM providers with tool support, structured output generation, streaming, and a complete event-driven agent system.

🚀 Features

  • 🔌 Multi-Provider Support: OpenAI and Ollama gateways
  • 🔒 Type-Safe: Leverages Rust's type system for safe LLM interactions
  • ⚡ Async-First: Built on Tokio for efficient async operations
  • 🛠️ Tool System: Extensible tool calling with automatic recursive execution
  • 📊 Structured Output: Generate type-safe structured data with serde (see the sketch after this list)
  • 🌊 Streaming: Async streaming with Pin<Box<dyn Stream>> (see the snippet after the Quick Start)
  • 🔍 Tracer System: Complete observability for debugging and monitoring
  • 🤖 Agent System: Event-driven multi-agent coordination with the ReAct pattern
  • 📦 24 Examples: Comprehensive examples demonstrating all features
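
The structured-output feature pairs with serde: you define a plain Rust struct, derive Deserialize, and have the broker return that type instead of free text. The struct below is ordinary serde code; the generate_object call and its signature are assumptions made for illustration only, so consult the API docs or the structured_output example for the actual method.

use mojentic::prelude::*;
use serde::Deserialize;
use std::sync::Arc;

// The shape we want the model's answer deserialized into via serde.
#[derive(Debug, Deserialize)]
struct CityFact {
    city: String,
    country: String,
    population: u64,
}

#[tokio::main]
async fn main() -> Result<()> {
    let gateway = Arc::new(OllamaGateway::new());
    let broker = LlmBroker::new("qwen3:32b", gateway, None);

    let messages = vec![LlmMessage::user("Describe Paris as structured data.")];

    // Hypothetical method name, used here only to show the shape of the call;
    // the structured_output example shows the real API.
    let fact: CityFact = broker.generate_object(&messages).await?;
    println!("{:?}", fact);

    Ok(())
}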

📦 Installation

Add Mojentic to your Cargo.toml:

[dependencies]
mojentic = "1.0.0"
tokio = { version = "1", features = ["full"] }

🔧 Prerequisites

To use Mojentic with local models, you need Ollama installed and running:

  1. Install Ollama from ollama.ai
  2. Pull a model: ollama pull qwen3:32b
  3. Verify it's running: ollama list

🎯 Quick Start

use mojentic::prelude::*;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<()> {
    // Point the broker at a locally running Ollama instance
    let gateway = Arc::new(OllamaGateway::new());
    let broker = LlmBroker::new("qwen3:32b", gateway, None);

    // Build the conversation to send to the model
    let messages = vec![
        LlmMessage::user("What is the capital of France?"),
    ];

    // Generate a completion and print it
    let response = broker.generate(&messages, None, None).await?;
    println!("Response: {}", response);

    Ok(())
}
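
Streaming works the same way but yields chunks as they arrive; per the feature list, the stream comes back as a Pin<Box<dyn Stream>>, so it can be consumed with futures::StreamExt. Continuing from the Quick Start above, the snippet below is only a sketch: the generate_stream method name and the Result-wrapped chunk items are assumptions, and the streaming example shows the real API.

use futures::StreamExt;

// Sketch only: `generate_stream` and the chunk type are assumed for
// illustration - see the `streaming` example for the actual method.
let mut stream = broker.generate_stream(&messages).await?;
while let Some(chunk) = stream.next().await {
    // Assuming each item is a Result-wrapped piece of text
    print!("{}", chunk?);
}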

📚 Documentation

Build documentation locally:

# Build API docs
cargo doc --no-deps --all-features

# Build the mdBook
mdbook build book
open book/book/index.html

๐Ÿ—๏ธ Architecture

Mojentic is structured in three layers:

Layer 1: LLM Integration

  • LlmBroker - Main interface for LLM interactions
  • LlmGateway trait - Abstract interface for LLM providers
  • OllamaGateway / OpenAiGateway - Provider implementations
  • ChatSession - Conversational session management
  • TokenizerGateway - Token counting with tiktoken-rs
  • EmbeddingsGateway - Vector embeddings
  • Comprehensive tool system with 10+ built-in tools
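
Because every provider sits behind the LlmGateway trait, swapping local Ollama for OpenAI is a change at construction time only; the rest of the broker code stays the same. The snippet below sketches this by analogy with the Quick Start: OpenAiGateway::new() and the model name are assumptions, and the real constructor (including how the API key is supplied) is documented in the API docs.

// Same broker code, different provider behind the LlmGateway trait.
// OpenAiGateway::new() is assumed by analogy with OllamaGateway::new();
// check the API docs for how the API key is actually configured.
let gateway = Arc::new(OpenAiGateway::new());
let broker = LlmBroker::new("gpt-4o-mini", gateway, None);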

Layer 2: Tracer System

  • TracerSystem - Thread-safe event recording
  • EventStore - Event persistence and querying
  • Correlation ID tracking with Arc sharing
  • LLM call, response, and tool events
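
A hedged sketch of how the tracer might be wired in: it assumes TracerSystem::new() exists and that the trailing argument of LlmBroker::new is the optional tracer. Both are assumptions for illustration; the tracer_demo example shows the actual setup.

// Assumed wiring, for illustration only: the constructor and the broker's
// trailing tracer argument are guesses - tracer_demo shows the real setup.
let tracer = Arc::new(TracerSystem::new());
let broker = LlmBroker::new("qwen3:32b", gateway, Some(tracer.clone()));

// After a call, the recorded LLM call/response and tool events can be
// inspected through the tracer's EventStore.
let _ = broker.generate(&messages, None, None).await?;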

Layer 3: Agent System

  • AsyncDispatcher - Event routing
  • Router - Event-to-agent routing
  • BaseLlmAgent / AsyncLlmAgent - LLM-powered agents
  • IterativeProblemSolver - Multi-step reasoning
  • SimpleRecursiveAgent - Self-recursive processing
  • SharedWorkingMemory - Agent context sharing with RwLock
  • ReAct pattern implementation
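
The agent layer is best explored through the iterative_solver and async_llm examples; the sketch below only gestures at the shape, and both the constructor and the solve call are hypothetical names used for illustration.

// Entirely illustrative: constructor and method names are hypothetical;
// see the iterative_solver example for the real API.
let solver = IterativeProblemSolver::new(broker.clone());
let answer = solver.solve("Outline a three-step plan to summarize a long PDF").await?;
println!("{}", answer);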

🧪 Examples

cargo run --example simple_llm
cargo run --example structured_output
cargo run --example tool_usage
cargo run --example streaming
cargo run --example chat_session
cargo run --example tracer_demo
cargo run --example async_llm
cargo run --example iterative_solver

🔧 Development

# Build
cargo build

# Test
cargo test

# Format
cargo fmt

# Lint
cargo clippy --all-targets --all-features -- -D warnings

# Security audit
cargo deny check

📄 License

MIT License - see LICENSE

Credits

Mojentic is a Mojility product by Stacey Vetzal.
