| Crates.io | llm-brain |
| lib.rs | llm-brain |
| version | 0.0.0 |
| created_at | 2025-04-27 16:47:49.152994+00 |
| updated_at | 2025-04-27 16:47:49.152994+00 |
| description | A memory layer with ConceptNet integration and LLM-friendly interfaces, rewritten in Rust using SurrealDB. |
| homepage | |
| repository | https://github.com/xhwhis/llm-brain |
| max_upload_size | |
| id | 1651293 |
| size | 362,043 |
LLMBrain is a Rust library providing a foundational memory layer inspired by neuroscience concepts, particularly suited for building Retrieval-Augmented Generation (RAG) systems and other AI applications requiring structured memory.
It focuses on core functionalities for storing, managing, and recalling information fragments, leveraging vector embeddings for semantic search.
Core concepts:

- **`LLMBrain` struct**: the main entry point for interacting with the memory system. It manages configuration, storage, embedding generation, and recall operations.
- **`MemoryFragment`**: the basic unit of stored information, containing content (text), metadata (a flexible JSON value for properties, relationships, types, etc.), and an embedding vector.
- **Embedding generation**: uses an LLM client (e.g. `OpenAiClient`, enabled via feature flags) to generate text embeddings for semantic recall.
- **`recall()`**: retrieves relevant `MemoryFragment`s ranked by semantic similarity to a query string.
- **`ConceptNetClient`**: optional knowledge enrichment (requires separate setup).

Add `llm-brain` as a dependency in your `Cargo.toml`:
```toml
[dependencies]
llm-brain = { path = "/path/to/llm-brain/crate" } # Or use version = "x.y.z" if published

# Enable features as needed, e.g., for SurrealDB storage and OpenAI embeddings:
# llm-brain = { path = "...", features = ["storage-surrealdb", "llm-openai"] }
```
Example usage:

```rust
use llm_brain::{LLMBrain, Result};
use serde_json::json;

#[tokio::main]
async fn main() -> Result<()> {
    // Ensure configuration is set up (e.g., ./config/default.toml)
    // pointing to a database and potentially LLM API keys.
    // See tests/common/mod.rs for setup examples.
    println!("Launching LLMBrain...");
    let brain = LLMBrain::launch().await?;
    println!("LLMBrain launched successfully.");

    // Add a memory fragment
    let content = "Rust is a systems programming language focused on safety and performance.".to_string();
    let metadata = json!({
        "entity_name": "Rust_Language",
        "memory_type": "Semantic",
        "properties": {
            "type": "Programming Language",
            "paradigm": ["Systems", "Functional", "Concurrent"],
            "key_features": ["Ownership", "Borrowing", "Concurrency"]
        }
    });
    println!("Adding memory...");
    let memory_id = brain.add_memory(content.clone(), metadata).await?;
    println!("Memory added with ID: {}", memory_id);

    // Recall relevant information
    let query = "Tell me about Rust safety features";
    println!("Recalling memories for query: '{}'...", query);
    let results = brain.recall(query, 1).await?;
    if let Some((fragment, score)) = results.first() {
        println!("\nTop result (Score: {:.4}):", score);
        println!("Content: {}", fragment.content);
        println!("Metadata: {:#}", fragment.metadata);
    } else {
        println!("No relevant memories found.");
    }

    Ok(())
}
```
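The example above relies on a configuration file at `./config/default.toml`, but its exact schema is not documented here. The fragment below is a hypothetical sketch of what such a file might contain (key names are assumptions, not the crate's real schema); consult `tests/common/mod.rs` for authoritative setup examples.

```toml
# Hypothetical config/default.toml sketch -- key names are illustrative only.
[storage]
# Connection details for the SurrealDB backend.
endpoint = "ws://localhost:8000"
namespace = "llm_brain"
database = "memory"

[llm]
# Provider used for embedding generation; the API key would typically
# come from an environment variable such as OPENAI_API_KEY.
provider = "openai"
```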
Development commands:

- `cargo build`
- `cargo check` (or `cargo check --tests`)
- `cargo test` (use `cargo test -- --ignored --nocapture` to run all tests; requires setup such as `OPENAI_API_KEY`)
- `cargo fmt` and `cargo clippy`

This library is licensed under the Unlicense. See the root LICENSE file for details.