openmemory v0.1.1
Repository: https://github.com/honeymaro/openmemory-rs

OpenMemory Rust SDK

License: MIT

Local-first long-term memory engine for AI apps and agents. Self-hosted. Explainable. Scalable.

A Rust port of the OpenMemory JavaScript SDK with native performance.


Quick Start

Add to your Cargo.toml:

[dependencies]
openmemory = "0.1"
tokio = { version = "1", features = ["full"] }
Then, in src/main.rs:

use openmemory::{OpenMemory, AddOptions, QueryOptions};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mem = OpenMemory::new(None).await?;

    // Add a memory
    let result = mem.add(
        "I'm building a web app with OpenMemory",
        AddOptions::default()
    ).await?;
    println!("Added memory: {}", result.id);

    // Query memories
    let results = mem.query("What am I building?", QueryOptions::default()).await?;
    for r in results {
        println!("[{:.2}] {}", r.score, r.content);
    }

    Ok(())
}

That's it. You're now running a fully local cognitive memory engine 🎉


Features

✅ Local-first - Runs entirely on your machine, zero external dependencies
✅ Multi-sector memory - Episodic, Semantic, Procedural, Emotional, Reflective
✅ Memory decay - Adaptive forgetting with sector-specific rates
✅ Waypoint graph - Associative recall paths via BFS expansion
✅ Hybrid search - Vector similarity + keyword filtering
✅ Zero config - Works out of the box with sensible defaults
✅ Native performance - Rust-powered speed and memory safety


Configuration

Basic Configuration

use openmemory::{OpenMemory, Config, EmbeddingKind, Tier};
use std::path::PathBuf;

let config = Config::builder()
    .db_path(PathBuf::from("./data/memory.db"))
    .tier(Tier::Smart)
    .embedding_kind(EmbeddingKind::Synthetic)
    .build();

let mem = OpenMemory::new(Some(config)).await?;

Embedding Providers

Synthetic (Testing/Development)

let config = Config::builder()
    .embedding_kind(EmbeddingKind::Synthetic)
    .build();

OpenAI (Recommended for Production)

let config = Config::builder()
    .embedding_kind(EmbeddingKind::OpenAI)
    .openai_key("sk-...".to_string())
    .openai_model("text-embedding-3-small".to_string())
    .build();

Gemini

let config = Config::builder()
    .embedding_kind(EmbeddingKind::Gemini)
    .gemini_key("your-api-key".to_string())
    .build();

Ollama (Fully Local)

let config = Config::builder()
    .embedding_kind(EmbeddingKind::Ollama)
    .ollama_url("http://localhost:11434".to_string())
    .ollama_model("llama3".to_string())
    .build();

AWS Bedrock

Enable the aws feature in Cargo.toml:

[dependencies]
openmemory = { version = "0.1", features = ["aws"] }

Then configure the Bedrock provider:

let config = Config::builder()
    .embedding_kind(EmbeddingKind::Bedrock)
    .build();
// Uses AWS credentials from environment or ~/.aws/credentials

Performance Tiers

| Tier   | Vector Dim | Description                                  |
|--------|------------|----------------------------------------------|
| Fast   | 256        | Optimized for speed, lower precision         |
| Smart  | 384        | Balanced performance and accuracy (default)  |
| Deep   | 1536       | Maximum accuracy, slower                     |
| Hybrid | 384        | Adaptive with keyword filtering              |

use openmemory::Tier;

let config = Config::builder()
    .tier(Tier::Hybrid)
    .build();
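
The tier-to-dimension mapping in the table above can be sketched as a simple match. This is illustrative only; `TierDim` here is a local stand-in, not the crate's `Tier` type.

```rust
// Illustrative stand-in mirroring the default vector dimensions
// from the tier table above (not the crate's internal type).
#[derive(Debug, Clone, Copy, PartialEq)]
enum TierDim {
    Fast,
    Smart,
    Deep,
    Hybrid,
}

// Default embedding dimension per tier, per the table above.
fn default_vec_dim(tier: TierDim) -> usize {
    match tier {
        TierDim::Fast => 256,
        TierDim::Smart | TierDim::Hybrid => 384,
        TierDim::Deep => 1536,
    }
}

fn main() {
    // Deep trades speed for the largest vectors.
    println!("Deep tier uses {} dimensions", default_vec_dim(TierDim::Deep));
}
```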

API Reference

add(content, options)

Store a new memory.

use openmemory::AddOptions;

let result = mem.add(
    "User prefers dark mode",
    AddOptions {
        tags: Some(vec!["preference".to_string(), "ui".to_string()]),
        salience: Some(0.8),
        ..Default::default()
    }
).await?;

println!("ID: {}", result.id);
println!("Sector: {:?}", result.primary_sector);

query(query, options)

Search for relevant memories using the HSG (Hybrid Similarity Graph) algorithm.

use openmemory::QueryOptions;

let results = mem.query(
    "user preferences",
    QueryOptions {
        k: 10,
        min_salience: Some(0.5),
        ..Default::default()
    }
).await?;

for r in results {
    println!("[{:.3}] {} - {:?}", r.score, r.content, r.primary_sector);
}

get_all(limit, offset)

Retrieve all memories with pagination.

let memories = mem.get_all(100, 0).await?;
println!("Total memories: {}", memories.len());

delete(id)

Remove a memory by ID.

mem.delete(&memory_id).await?;

reinforce(id, boost)

Boost a memory's salience score.

mem.reinforce(&memory_id, 0.2).await?;

run_decay()

Process memory decay based on time elapsed.

let stats = mem.run_decay().await?;
println!("Processed: {}, Decayed: {}", stats.processed, stats.decayed);

Cognitive Sectors

OpenMemory automatically classifies content into 5 cognitive sectors:

| Sector     | Description                  | Examples                             | Decay Rate       |
|------------|------------------------------|--------------------------------------|------------------|
| Episodic   | Time-bound events & experiences | "Yesterday I attended a conference" | Medium (0.015)   |
| Semantic   | Timeless facts & knowledge   | "Paris is the capital of France"     | Very Low (0.005) |
| Procedural | Skills, procedures, how-tos  | "To deploy: build, test, push"       | Low (0.008)      |
| Emotional  | Feelings, sentiment, mood    | "I'm excited about this project!"    | High (0.02)      |
| Reflective | Meta-cognition, insights     | "I learn best through practice"      | Very Low (0.001) |

use openmemory::Sector;

// Query specific sectors
let results = mem.query(
    "how to deploy",
    QueryOptions {
        sectors: Some(vec![Sector::Procedural]),
        ..Default::default()
    }
).await?;
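
The sector-specific decay rates can be pictured with a simple exponential model. This is an assumption for illustration, treating each rate as a per-day constant; the crate's actual decay function may differ.

```rust
// Hypothetical illustration of sector-specific decay: salience after
// `days` days with per-day decay rate `rate`, assuming simple
// exponential decay (the crate's actual model may differ).
fn decayed_salience(salience: f64, rate: f64, days: f64) -> f64 {
    salience * (-rate * days).exp()
}

fn main() {
    // Semantic (0.005) vs Emotional (0.02) after 30 days,
    // both starting at salience 0.8.
    let semantic = decayed_salience(0.8, 0.005, 30.0);
    let emotional = decayed_salience(0.8, 0.02, 30.0);
    println!("semantic: {:.3}, emotional: {:.3}", semantic, emotional);
    assert!(semantic > emotional); // emotional memories fade faster
}
```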

HSG Query Algorithm

The Hybrid Similarity Graph (HSG) combines multiple signals for retrieval:

final_score = sigmoid(
    0.40 × vector_similarity +
    0.20 × token_overlap +
    0.15 × waypoint_weight +
    0.15 × recency_score +
    0.10 × tag_match +
    keyword_boost (Hybrid tier)
)

Features:

  • Sector penalties for cross-sector retrieval
  • BFS waypoint expansion for associative recall
  • Feedback learning with EMA score updates
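
The scoring formula above can be transcribed directly into Rust. The weights and the sigmoid are as documented; the signal values and function names are illustrative, not the crate's internal code.

```rust
// Standard logistic sigmoid, squashing the weighted sum into (0, 1).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Illustrative transcription of the HSG scoring formula above.
// `keyword_boost` is non-zero only on the Hybrid tier.
fn hsg_score(
    vector_similarity: f64,
    token_overlap: f64,
    waypoint_weight: f64,
    recency_score: f64,
    tag_match: f64,
    keyword_boost: f64,
) -> f64 {
    sigmoid(
        0.40 * vector_similarity
            + 0.20 * token_overlap
            + 0.15 * waypoint_weight
            + 0.15 * recency_score
            + 0.10 * tag_match
            + keyword_boost,
    )
}

fn main() {
    // A strong vector match with moderate recency, no keyword boost.
    let score = hsg_score(0.9, 0.5, 0.2, 0.7, 1.0, 0.0);
    println!("final_score = {:.3}", score);
}
```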

Environment Variables

| Variable       | Description        | Default      |
|----------------|--------------------|--------------|
| OM_DB_PATH     | Database file path | :memory:     |
| OM_TIER        | Performance tier   | smart        |
| OM_EMBEDDING   | Embedding provider | synthetic    |
| OM_VEC_DIM     | Vector dimensions  | Tier default |
| OPENAI_API_KEY | OpenAI API key     | -            |
| GEMINI_API_KEY | Gemini API key     | -            |
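
For example, a fully local setup with a persistent database might be configured like this (values are illustrative):

```shell
# Persist memories to disk, use the deep tier, stay fully local.
export OM_DB_PATH="./data/memory.db"
export OM_TIER="deep"
export OM_EMBEDDING="synthetic"
```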

Examples

Run the basic usage example:

cd openmemory-rs
cargo run --example basic_usage

Performance

Benchmarks on Apple M1:

| Operation           | Time   |
|---------------------|--------|
| Synthetic embed     | ~0.5ms |
| Add memory          | ~2ms   |
| Query (1k memories) | ~15ms  |
| Decay batch (1k)    | ~50ms  |

Run benchmarks:

cargo bench

Feature Flags

| Feature | Description                           |
|---------|---------------------------------------|
| aws     | Enable AWS Bedrock embedding provider |

[dependencies]
openmemory = { version = "0.1", features = ["aws"] }

Minimum Supported Rust Version

Rust 1.70 or later.


License

MIT License - see LICENSE for details.


Contributing

Contributions are welcome! Please read our contributing guidelines before submitting PRs.

# Run tests
cargo test

# Run clippy
cargo clippy

# Format code
cargo fmt