| Crates.io | umi-memory |
| lib.rs | umi-memory |
| version | 0.1.0 |
| created_at | 2026-01-12 20:50:48.455901+00 |
| updated_at | 2026-01-12 20:50:48.455901+00 |
| description | Memory library for AI agents with deterministic simulation testing |
| homepage | https://github.com/rita-aga/umi |
| repository | https://github.com/rita-aga/umi |
| max_upload_size | |
| id | 2038835 |
| size | 1,021,592 |
A production-ready memory library for AI agents with deterministic simulation testing.
Add to your `Cargo.toml`:

```toml
[dependencies]
umi-memory = "0.1"
```
```rust
use umi_memory::umi::{Memory, RememberOptions, RecallOptions};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create memory with simulation providers (deterministic, seed 42)
    let mut memory = Memory::sim(42);

    // Remember information
    memory.remember(
        "Alice is a software engineer at Acme Corp",
        RememberOptions::default(),
    ).await?;

    // Recall information
    let results = memory.recall("Who works at Acme?", RecallOptions::default()).await?;
    for entity in results {
        println!("Found: {} - {}", entity.name, entity.content);
    }

    Ok(())
}
```
```rust
use umi_memory::umi::{Memory, MemoryBuilder, MemoryConfig};
use umi_memory::llm::AnthropicProvider;
use umi_memory::embedding::OpenAIEmbedding;
use umi_memory::storage::LanceStorageBackend;
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Configure production memory settings
    let config = MemoryConfig::default()
        .with_core_memory_bytes(128 * 1024)                     // 128 KB
        .with_working_memory_bytes(10 * 1024 * 1024)            // 10 MB
        .with_working_memory_ttl(Duration::from_secs(3600 * 4)) // 4 hours
        .with_recall_limit(50);

    // Create with real providers
    let llm = AnthropicProvider::new(std::env::var("ANTHROPIC_API_KEY")?);
    let embedder = OpenAIEmbedding::new(std::env::var("OPENAI_API_KEY")?);
    let storage = LanceStorageBackend::connect("./lance_db").await?;

    let mut memory = MemoryBuilder::new()
        .with_llm(llm)
        .with_embedder(embedder)
        .with_storage(storage)
        .with_config(config)
        .build();

    // Use memory...
    Ok(())
}
```
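The working-memory TTL configured above means entries older than the configured duration become eligible for eviction. As an illustrative sketch (hypothetical, not the crate's eviction code), a TTL check with `std::time::Instant` looks like:

```rust
use std::time::{Duration, Instant};

// Hypothetical TTL-tracked entry, illustrating the working-memory TTL idea.
struct Entry {
    content: String,
    inserted_at: Instant,
}

impl Entry {
    fn is_expired(&self, ttl: Duration) -> bool {
        self.inserted_at.elapsed() >= ttl
    }
}

fn main() {
    let ttl = Duration::from_secs(3600 * 4); // 4 hours, matching the config above
    let entry = Entry {
        content: "Alice is a software engineer".to_string(),
        inserted_at: Instant::now(),
    };
    // Freshly inserted entries are still live.
    println!("expired: {}", entry.is_expired(ttl));
}
```

A real eviction pass would additionally enforce the byte budgets, dropping expired or least-recently-used entries until the working set fits.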
Available feature flags:

- `lance` - LanceDB storage backend (persistent, embedded vector database)
- `postgres` - PostgreSQL storage backend (persistent, external database)
- `anthropic` - Anthropic LLM provider (Claude)
- `openai` - OpenAI LLM provider (GPT, embeddings)
- `llm-providers` - All LLM providers (convenience flag)
- `embedding-providers` - All embedding providers (convenience flag)

```toml
[dependencies]
umi-memory = { version = "0.1", features = ["lance", "anthropic", "openai"] }
```
Umi follows TigerStyle principles:
- `debug_assert!` validates invariants

See the `examples/` directory:

- `quick_start.rs` - Basic remember/recall workflow
- `production_setup.rs` - Production configuration with real providers
- `configuration.rs` - Customize memory behavior
- `test_anthropic.rs` - Test Anthropic integration (requires API key)
- `test_openai.rs` - Test OpenAI integration (requires API key)

Run examples:

```bash
cargo run --example quick_start
cargo run --example production_setup --features lance,anthropic
```
Umi has comprehensive test coverage:
```bash
# Run all tests
cargo test -p umi-memory --all-features

# Run fault injection tests
cargo test -p umi-memory --lib dst_tests

# Run benchmarks
cargo bench -p umi-memory
```
Every component MUST have a simulation implementation:
```rust
use umi_memory::{Memory, SimLLMProvider, SimStorageBackend, SimConfig};

// Deterministic - same seed = same results
let config = SimConfig::with_seed(42);
let memory = Memory::sim(42);
```
Why? The same seed always produces the same results, so tests are reproducible and bugs can be replayed exactly.
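The determinism guarantee rests on seeding every source of randomness. As a minimal, hypothetical sketch (not umi's internals), a seeded xorshift generator reproduces identical sequences for identical seeds:

```rust
// Minimal xorshift64 PRNG: same seed, same sequence (illustrative only).
struct SimRng {
    state: u64,
}

impl SimRng {
    fn new(seed: u64) -> Self {
        // Avoid the all-zero state, which xorshift can never leave.
        Self { state: seed.max(1) }
    }

    fn next_u64(&mut self) -> u64 {
        let mut x = self.state;
        x ^= x << 13;
        x ^= x >> 7;
        x ^= x << 17;
        self.state = x;
        x
    }
}

fn main() {
    let mut a = SimRng::new(42);
    let mut b = SimRng::new(42);
    let run_a: Vec<u64> = (0..5).map(|_| a.next_u64()).collect();
    let run_b: Vec<u64> = (0..5).map(|_| b.next_u64()).collect();
    // Identical seeds reproduce identical runs.
    assert_eq!(run_a, run_b);
    println!("deterministic: {}", run_a == run_b);
}
```

When every simulated provider draws from a generator like this, an entire test run becomes a pure function of its seed.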
Components handle failures gracefully:
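One way to exercise those failure paths deterministically is to inject faults on a fixed schedule. The sketch below is hypothetical (the trait, `FaultyStorage`, and `put_with_retry` are not umi's actual API); it shows a simulated backend that fails every Nth call, so retry logic can be tested reproducibly:

```rust
use std::collections::HashMap;

#[derive(Debug, PartialEq)]
enum StoreError {
    Unavailable,
}

trait StorageBackend {
    fn put(&mut self, key: &str, value: &str) -> Result<(), StoreError>;
}

// Simulated backend that injects a failure on every Nth call.
struct FaultyStorage {
    calls: u64,
    fail_every: u64,
    data: HashMap<String, String>,
}

impl FaultyStorage {
    fn new(fail_every: u64) -> Self {
        Self { calls: 0, fail_every, data: HashMap::new() }
    }
}

impl StorageBackend for FaultyStorage {
    fn put(&mut self, key: &str, value: &str) -> Result<(), StoreError> {
        self.calls += 1;
        if self.calls % self.fail_every == 0 {
            return Err(StoreError::Unavailable); // injected fault
        }
        self.data.insert(key.to_string(), value.to_string());
        Ok(())
    }
}

// Caller-side graceful handling: retry a bounded number of times.
fn put_with_retry(
    storage: &mut dyn StorageBackend,
    key: &str,
    value: &str,
    max_attempts: u32,
) -> Result<(), StoreError> {
    let mut last = Err(StoreError::Unavailable);
    for _ in 0..max_attempts {
        last = storage.put(key, value);
        if last.is_ok() {
            return last;
        }
    }
    last
}

fn main() {
    let mut storage = FaultyStorage::new(3); // every 3rd put fails
    let ok = (0..6)
        .filter(|i| put_with_retry(&mut storage, &format!("k{i}"), "v", 2).is_ok())
        .count();
    println!("successful puts: {ok}");
}
```

Because the failure schedule is fixed rather than random, a test that passes (or fails) once behaves the same on every run.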
Use debug_assert! for invariants:
```rust
fn store(&mut self, data: &[u8]) -> Result<()> {
    debug_assert!(!data.is_empty(), "data must not be empty");
    debug_assert!(data.len() <= self.capacity, "data exceeds capacity");
    // ...
}
```
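A note on semantics: `debug_assert!` compiles to nothing in release builds, so it documents and checks invariants during development without imposing runtime cost in production. A small self-contained illustration (the function name is hypothetical):

```rust
// debug_assert! checks invariants in debug builds only; cfg!(debug_assertions)
// reports which mode the binary was compiled in.
fn clamp_to_capacity(len: usize, capacity: usize) -> usize {
    debug_assert!(capacity > 0, "capacity must be positive");
    len.min(capacity)
}

fn main() {
    // In release builds the debug_assert! above is compiled out entirely.
    println!("debug assertions enabled: {}", cfg!(debug_assertions));
    println!("clamped: {}", clamp_to_capacity(10, 4));
}
```

For conditions that must also hold in production (e.g. validating caller input), return a `Result` instead of relying on `debug_assert!`.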
Python bindings are available but experimental. See PYTHON.md for details.
```python
from umi import Memory

memory = Memory.sim(42)
memory.remember("Alice is an engineer")
results = memory.recall("Who is Alice?")
```
Status: basic functionality works, but type coverage is incomplete. The Rust API is primary.
Contributions welcome! Please run `cargo test --all-features` before opening a PR.

MIT License - see LICENSE for details.
Umi was extracted from RikaiOS to be a standalone library. Inspired by:
Umi follows semantic versioning (SemVer):
Breaking changes are expected in 0.x versions.