| Crates.io | frame-thoughtchain |
| lib.rs | frame-thoughtchain |
| version | 0.2.1 |
| created_at | 2025-12-22 22:42:55.726872+00 |
| updated_at | 2026-01-08 06:53:52.274938+00 |
| description | Autonomous thought tracking and knowledge management for AI systems |
| homepage | |
| repository | https://github.com/Blackfall-Labs/frame-thoughtchain |
| max_upload_size | |
| id | 2000393 |
| size | 125,605 |
Persistent reasoning, decision tracking, and dynamic knowledge management for AI systems.
- Track an AI system's internal reasoning, decisions, and observations across sessions.
- Temporal-aware conversation history retrieval with intelligent query detection.
- Hot-reloadable knowledge archives with filesystem watching: new .eng or .db files are loaded automatically.

Add to your Cargo.toml:
[dependencies]
frame-thoughtchain = "0.1.0"
frame-thoughtchain depends on:
frame-thoughtchain
└── frame-catalog (vector search, embeddings, database)
Used by: Frame core for autonomous reasoning
Position in Frame ecosystem:
frame-catalog
└→ frame-thoughtchain
use frame_thoughtchain::{ThoughtChainStore, ThoughtType};
use sam_memory::database::Database;
use uuid::Uuid;
// Create database and initialize schema
let db = Database::new("thoughts.db")?;
let store = ThoughtChainStore::new(&db);
store.initialize_schema()?;
// Create a session
let session_id = Uuid::new_v4();
store.create_session(session_id, None, Some("my-project"), Some("main"), None)?;
// Log a thought with embedding
let thought_id = store.log_thought(
session_id,
ThoughtType::Decision,
"Decided to use persistent storage for reliability".to_string(),
vec!["database.rs".to_string()],
&embedder,
).await?;
// Retrieve recent thoughts
let thoughts = store.get_session_thoughts(session_id, 10)?;
for thought in thoughts {
println!("{:?}: {}", thought.thought_type, thought.content);
}
// Search by semantic similarity
let query_embedding = embedder.generate("storage decisions")?;
let results = store.search_thoughts(&query_embedding, 5)?;
for (thought, score) in results {
println!("Similarity {:.3}: {}", score, thought.content);
}
use frame_thoughtchain::{is_conversation_query, search_with_temporal_context, format_conversation_context};
let query = "What did I first ask about?";
if is_conversation_query(query) {
// Route to conversation search
let events = search_with_temporal_context(
&db,
conversation_id,
query,
&embedder,
5
)?;
let context = format_conversation_context(&events);
println!("{}", context);
} else {
// Route to knowledge base search
// ...
}
use frame_thoughtchain::EngramRegistry;
// Create registry with filesystem watching
let registry = EngramRegistry::new("engrams/", true)?;
println!("Loaded {} engrams", registry.list_engrams().len());
println!("Total chunks: {}", registry.total_chunks());
// Search across all engrams
let query_embedding = embedder.generate("How do I use async Rust?")?;
let results = registry.search_all(&query_embedding, 10)?;
for result in results {
println!("[{}] {:.3}: {}",
result.engram_id,
result.score,
result.content
);
}
// Add new .eng file to directory → automatically loaded!
ThoughtChainStore

Core Methods:
- new(database: &Database) - Create store
- initialize_schema() - Create tables and indices
- create_session() - Start new session
- end_session() - Mark session complete
- log_thought() - Store thought with embedding
- get_session_thoughts() - Retrieve thoughts for session
- get_recent_thoughts() - Get recent thoughts across all sessions
- search_thoughts() - Semantic similarity search

Migration:
- migrate_add_opcode_column() - Add opcode support to existing DB

A combined usage sketch follows the ThoughtType enum below.

ThoughtType Enum

pub enum ThoughtType {
Reasoning, // Internal analysis
Decision, // Choice made + rationale
Reflection, // Looking back on past events
Observation, // Pattern noticed about user/system
Question, // Uncertainty or open question
Milestone, // Completion or achievement
}
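
To tie these together, here is a minimal sketch of a session lifecycle using the methods listed above with a mix of ThoughtType variants. The argument lists for end_session, get_recent_thoughts, and migrate_add_opcode_column are not shown elsewhere in this README, so they are assumptions here.

// Sketch: one session from creation to close-out.
let session_id = Uuid::new_v4();
store.create_session(session_id, None, Some("my-project"), Some("main"), None)?;

// Log thoughts of different types as work progresses.
store.log_thought(
    session_id,
    ThoughtType::Question,
    "Is SQLite sufficient for the expected write volume?".to_string(),
    vec![],
    &embedder,
).await?;
store.log_thought(
    session_id,
    ThoughtType::Milestone,
    "Schema initialized and first thoughts persisted".to_string(),
    vec!["thoughts.db".to_string()],
    &embedder,
).await?;

// Close the session and review recent activity across all sessions.
store.end_session(session_id)?;              // assumed signature
let recent = store.get_recent_thoughts(20)?; // assumed signature: limit only
for thought in recent {
    println!("{:?}: {}", thought.thought_type, thought.content);
}

// Upgrade an older database in place to add opcode support.
store.migrate_add_opcode_column()?;          // assumed: no arguments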
ThoughtEntry Struct

pub struct ThoughtEntry {
pub id: Uuid,
pub session_id: Uuid,
pub timestamp: DateTime<Utc>,
pub thought_type: ThoughtType,
pub content: String,
pub context: Vec<String>, // Related files/references
pub metadata: Option<Value>, // JSON metadata
pub opcode_data: Option<Vec<u8>>, // Binary decision encoding
}
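
Since these fields are public (per the definition above), entries returned by the store can be inspected directly. A short sketch; the only assumption beyond the struct definition is that get_session_thoughts yields ThoughtEntry values, as in the quickstart:

// Sketch: reading ThoughtEntry fields returned by the store.
let thoughts = store.get_session_thoughts(session_id, 10)?;
for entry in thoughts {
    println!("[{}] {:?} at {}", entry.id, entry.thought_type, entry.timestamp);
    println!("  {}", entry.content);
    for reference in &entry.context {
        println!("  related: {}", reference);
    }
    if let Some(meta) = &entry.metadata {
        println!("  metadata: {}", meta); // serde_json::Value prints as compact JSON
    }
    if let Some(bytes) = &entry.opcode_data {
        println!("  opcode payload: {} bytes", bytes.len());
    }
}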
Conversation & Temporal Search Functions:
- is_conversation_query(query: &str) -> bool - Detect conversation references
- detect_temporal_context(query: &str) -> TemporalContext - Extract time context
- search_with_temporal_context() - Combined temporal + semantic search
- get_first_user_message() - Get conversation start
- get_recent_messages() - Get last N messages
- format_conversation_context() - Export for context injection

EngramRegistry

Setup:
- new(directory, watch) - Create registry with optional watching
- load_all() - Manually reload all engrams

Querying:
- search_all(embedding, limit) - Query all engrams
- list_engrams() - Get loaded engram IDs
- get_engram_info(id) - Get metadata for specific engram
- total_chunks() - Total chunks across all engrams

Track reasoning across sessions to maintain consistency:
// Session 1: User asks about database choice
store.log_thought(
session_id,
ThoughtType::Decision,
"Recommended SQLite for portability and zero-config deployment",
vec!["database-choice"],
&embedder
).await?;
// Session 2: User asks why we chose SQLite
let query_embedding = embedder.generate("database recommendation")?;
let thoughts = store.search_thoughts(&query_embedding, 1)?;
// Returns: "Recommended SQLite for portability..."
Handle temporal queries naturally:
// "What was the first thing I asked?"
let first_msg = get_first_user_message(&db, conversation_id)?;
// "What did we talk about earlier?"
let context = search_with_temporal_context(
&db,
conversation_id,
"What did we discuss earlier?",
&embedder,
5
)?;
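
The remaining helpers from the function list follow the same pattern. In this sketch, the (db, conversation_id, count) signature for get_recent_messages and a Debug impl on TemporalContext are assumptions; only the function names and their purposes come from the list above.

use frame_thoughtchain::{detect_temporal_context, get_recent_messages};

// "What were the last few things we covered?"
let recent = get_recent_messages(&db, conversation_id, 5)?; // assumed signature
println!("Retrieved {} recent messages", recent.len());

// Classify the time frame a query refers to before routing the search.
let temporal = detect_temporal_context("What did I first ask about?");
println!("Detected temporal context: {:?}", temporal); // assumes TemporalContext: Debug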
Update knowledge without restarting:
# Application running with engram registry watching "engrams/"
cp new-rust-docs.eng engrams/
# → Automatically detected and loaded
# → Immediately available for queries
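
If the registry was created with watching disabled, or an update needs to be visible immediately, load_all() from the API list above refreshes it by hand. A sketch; whether load_all takes arguments or needs a mutable handle is not documented here and is assumed not to.

// Registry without filesystem watching: reload on demand.
let registry = EngramRegistry::new("engrams/", false)?;

// ... copy new-rust-docs.eng into engrams/ ...

registry.load_all()?; // assumed: no arguments, shared reference is enough
println!("Now serving {} engrams / {} chunks",
    registry.list_engrams().len(),
    registry.total_chunks()
);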
Track system achievements:
store.log_thought(
session_id,
ThoughtType::Milestone,
"Completed full Frame microservices extraction into 7 focused crates",
vec!["sam-utils", "frame-thoughtchain", "sam-vector", ...],
&embedder
).await?;
ThoughtChain uses SQLite for storage, with tables for sessions and thought entries.
Indices are created on (session_id, timestamp) and (thought_type, timestamp).
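
The actual DDL is produced by initialize_schema(); the statements below are only an illustration inferred from the ThoughtEntry fields and the index columns named above, so the table name and column types are assumptions.

// Illustration only - the real schema is created by store.initialize_schema().
const ILLUSTRATIVE_SCHEMA: &str = r#"
CREATE TABLE IF NOT EXISTS thoughts (
    id            TEXT PRIMARY KEY,   -- Uuid
    session_id    TEXT NOT NULL,      -- Uuid
    timestamp     TEXT NOT NULL,      -- DateTime<Utc>, RFC 3339
    thought_type  TEXT NOT NULL,      -- ThoughtType variant name
    content       TEXT NOT NULL,
    context       TEXT,               -- JSON array of related files
    metadata      TEXT,               -- JSON metadata
    opcode_data   BLOB                -- binary decision encoding
);
CREATE INDEX IF NOT EXISTS idx_thoughts_session ON thoughts (session_id, timestamp);
CREATE INDEX IF NOT EXISTS idx_thoughts_type    ON thoughts (thought_type, timestamp);
"#;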
Dependencies

Core:
- rusqlite (0.31) - SQLite with FTS5
- serde, serde_json - Serialization
- chrono - Timestamps
- uuid - Unique identifiers

Engram Support:
- notify (6.0) - Filesystem watching
- cml - Content Markup Language with embeddings

Temporary (until sam-vector extraction):
- sam-memory - Database and embedding generator traits

Planned: a sam-vector crate (which removes the sam-memory dependency) and a sam-opcode crate.

Extracted from the Frame project, where it provides persistent reasoning and knowledge management for the AI assistant.
MIT - See LICENSE for details.
Magnus Trent magnus@blackfall.dev