| Field | Value |
|---|---|
| Crates.io | openmemory |
| lib.rs | openmemory |
| version | 0.1.1 |
| created_at | 2025-12-16 06:01:25.523958+00 |
| updated_at | 2025-12-16 06:07:34.08449+00 |
| description | OpenMemory - Cognitive memory system for AI applications |
| homepage | |
| repository | https://github.com/honeymaro/openmemory-rs |
| max_upload_size | |
| id | 1987293 |
| size | 317,815 |
Local-first long-term memory engine for AI apps and agents. Self-hosted. Explainable. Scalable.
A Rust port of the OpenMemory JavaScript SDK with native performance.
Add to your Cargo.toml:

```toml
[dependencies]
openmemory = "0.1"
tokio = { version = "1", features = ["full"] }
```
```rust
use openmemory::{OpenMemory, AddOptions, QueryOptions};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mem = OpenMemory::new(None).await?;

    // Add a memory
    let result = mem.add(
        "I'm building a web app with OpenMemory",
        AddOptions::default()
    ).await?;
    println!("Added memory: {}", result.id);

    // Query memories
    let results = mem.query("What am I building?", QueryOptions::default()).await?;
    for r in results {
        println!("[{:.2}] {}", r.score, r.content);
    }

    Ok(())
}
```
That's it. You're now running a fully local cognitive memory engine 🎉
- ✅ **Local-first** - Runs entirely on your machine, zero external dependencies
- ✅ **Multi-sector memory** - Episodic, Semantic, Procedural, Emotional, Reflective
- ✅ **Memory decay** - Adaptive forgetting with sector-specific rates
- ✅ **Waypoint graph** - Associative recall paths via BFS expansion
- ✅ **Hybrid search** - Vector similarity + keyword filtering
- ✅ **Zero config** - Works out of the box with sensible defaults
- ✅ **Native performance** - Rust-powered speed and memory safety
```rust
use openmemory::{OpenMemory, Config, EmbeddingKind, Tier};
use std::path::PathBuf;

let config = Config::builder()
    .db_path(PathBuf::from("./data/memory.db"))
    .tier(Tier::Smart)
    .embedding_kind(EmbeddingKind::Synthetic)
    .build();

let mem = OpenMemory::new(Some(config)).await?;
```
Synthetic embeddings (default, no API key required):

```rust
let config = Config::builder()
    .embedding_kind(EmbeddingKind::Synthetic)
    .build();
```

OpenAI:

```rust
let config = Config::builder()
    .embedding_kind(EmbeddingKind::OpenAI)
    .openai_key("sk-...".to_string())
    .openai_model("text-embedding-3-small".to_string())
    .build();
```

Gemini:

```rust
let config = Config::builder()
    .embedding_kind(EmbeddingKind::Gemini)
    .gemini_key("your-api-key".to_string())
    .build();
```

Ollama:

```rust
let config = Config::builder()
    .embedding_kind(EmbeddingKind::Ollama)
    .ollama_url("http://localhost:11434".to_string())
    .ollama_model("llama3".to_string())
    .build();
```
Enable the `aws` feature in Cargo.toml:

```toml
[dependencies]
openmemory = { version = "0.1", features = ["aws"] }
```

```rust
let config = Config::builder()
    .embedding_kind(EmbeddingKind::Bedrock)
    .build();
// Uses AWS credentials from the environment or ~/.aws/credentials
```
| Tier | Vector Dim | Description |
|---|---|---|
| Fast | 256 | Optimized for speed, lower precision |
| Smart | 384 | Balanced performance and accuracy (default) |
| Deep | 1536 | Maximum accuracy, slower |
| Hybrid | 384 | Adaptive with keyword filtering |
```rust
use openmemory::Tier;

let config = Config::builder()
    .tier(Tier::Hybrid)
    .build();
```
`add(content, options)` - Store a new memory.

```rust
use openmemory::AddOptions;

let result = mem.add(
    "User prefers dark mode",
    AddOptions {
        tags: Some(vec!["preference".to_string(), "ui".to_string()]),
        salience: Some(0.8),
        ..Default::default()
    }
).await?;

println!("ID: {}", result.id);
println!("Sector: {:?}", result.primary_sector);
```
`query(query, options)` - Search for relevant memories using the HSG (Hybrid Similarity Graph) algorithm.

```rust
use openmemory::QueryOptions;

let results = mem.query(
    "user preferences",
    QueryOptions {
        k: 10,
        min_salience: Some(0.5),
        ..Default::default()
    }
).await?;

for r in results {
    println!("[{:.3}] {} - {:?}", r.score, r.content, r.primary_sector);
}
```
`get_all(limit, offset)` - Retrieve all memories with pagination.

```rust
let memories = mem.get_all(100, 0).await?;
println!("Total memories: {}", memories.len());
```
`delete(id)` - Remove a memory by ID.

```rust
mem.delete(&memory_id).await?;
```

`reinforce(id, boost)` - Boost a memory's salience score.

```rust
mem.reinforce(&memory_id, 0.2).await?;
```

`run_decay()` - Process memory decay based on time elapsed.

```rust
let stats = mem.run_decay().await?;
println!("Processed: {}, Decayed: {}", stats.processed, stats.decayed);
```
OpenMemory automatically classifies content into 5 cognitive sectors:
| Sector | Description | Examples | Decay Rate |
|---|---|---|---|
| Episodic | Time-bound events & experiences | "Yesterday I attended a conference" | Medium (0.015) |
| Semantic | Timeless facts & knowledge | "Paris is the capital of France" | Very Low (0.005) |
| Procedural | Skills, procedures, how-tos | "To deploy: build, test, push" | Low (0.008) |
| Emotional | Feelings, sentiment, mood | "I'm excited about this project!" | High (0.02) |
| Reflective | Meta-cognition, insights | "I learn best through practice" | Very Low (0.001) |
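The per-sector rates above suggest exponential forgetting. A minimal sketch, assuming salience decays as `s · e^(−rate · days)` (the crate's actual decay schedule and units may differ):

```rust
// Hypothetical sketch of sector-specific salience decay, assuming an
// exponential schedule s' = s * exp(-rate * days). Rates are taken from
// the table above; this is not the crate's exact implementation.
fn decay(salience: f64, rate_per_day: f64, days: f64) -> f64 {
    salience * (-rate_per_day * days).exp()
}

fn main() {
    let episodic = decay(0.8, 0.015, 30.0); // medium decay
    let semantic = decay(0.8, 0.005, 30.0); // very low decay
    // Semantic facts outlive episodic events at equal starting salience.
    assert!(semantic > episodic);
    println!("episodic: {:.3}, semantic: {:.3}", episodic, semantic);
}
```

Under this model, a Reflective memory (rate 0.001) would retain over 97% of its salience after a month, while an Emotional one (rate 0.02) keeps barely half.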
```rust
use openmemory::Sector;

// Query specific sectors
let results = mem.query(
    "how to deploy",
    QueryOptions {
        sectors: Some(vec![Sector::Procedural]),
        ..Default::default()
    }
).await?;
```
The Hybrid Similarity Graph (HSG) combines multiple signals for retrieval:

```text
final_score = sigmoid(
    0.40 × vector_similarity +
    0.20 × token_overlap +
    0.15 × waypoint_weight +
    0.15 × recency_score +
    0.10 × tag_match +
    keyword_boost            (Hybrid tier only)
)
```
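For intuition, the formula can be sketched as a plain function. The weights are the ones published above; the choice of logistic sigmoid and the way `keyword_boost` is folded in are assumptions, not the crate's exact code:

```rust
// Illustrative re-implementation of the HSG scoring formula above.
// The logistic sigmoid and keyword_boost handling are assumptions.
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn hsg_score(
    vector_similarity: f64,
    token_overlap: f64,
    waypoint_weight: f64,
    recency_score: f64,
    tag_match: f64,
    keyword_boost: f64, // non-zero only on the Hybrid tier
) -> f64 {
    sigmoid(
        0.40 * vector_similarity
            + 0.20 * token_overlap
            + 0.15 * waypoint_weight
            + 0.15 * recency_score
            + 0.10 * tag_match
            + keyword_boost,
    )
}

fn main() {
    // A strong vector match with good recency and an exact tag hit:
    let score = hsg_score(0.9, 0.5, 0.2, 0.8, 1.0, 0.0);
    assert!(score > 0.5 && score < 1.0);
    println!("{:.3}", score);
}
```

Because the sum is squashed through a sigmoid, scores land in (0, 1) and a perfectly neutral memory (all signals zero) sits at 0.5 rather than 0.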
| Variable | Description | Default |
|---|---|---|
| OM_DB_PATH | Database file path | :memory: |
| OM_TIER | Performance tier | smart |
| OM_EMBEDDING | Embedding provider | synthetic |
| OM_VEC_DIM | Vector dimensions | Tier default |
| OPENAI_API_KEY | OpenAI API key | - |
| GEMINI_API_KEY | Gemini API key | - |
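For example, to point OpenMemory at a file-backed database and the OpenAI provider via the environment (values are illustrative; the API key is a placeholder):

```shell
# Illustrative configuration; adjust paths and keys for your setup.
export OM_DB_PATH=./data/memory.db   # default: :memory:
export OM_TIER=deep                  # fast | smart | deep | hybrid
export OM_EMBEDDING=openai
export OPENAI_API_KEY=sk-...
```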
Run the basic usage example:

```shell
cd openmemory-rs
cargo run --example basic_usage
```
Benchmarks on Apple M1:
| Operation | Time |
|---|---|
| Synthetic embed | ~0.5ms |
| Add memory | ~2ms |
| Query (1k memories) | ~15ms |
| Decay batch (1k) | ~50ms |
Run benchmarks:

```shell
cargo bench
```
| Feature | Description |
|---|---|
| aws | Enable AWS Bedrock embedding provider |

```toml
[dependencies]
openmemory = { version = "0.1", features = ["aws"] }
```
Requires Rust 1.70 or later.
MIT License - see LICENSE for details.
Contributions are welcome! Please read our contributing guidelines before submitting PRs.
```shell
# Run tests
cargo test

# Run clippy
cargo clippy

# Format code
cargo fmt
```