| Crates.io | thermogram |
| lib.rs | thermogram |
| version | 0.5.2 |
| created_at | 2026-01-07 02:49:56.563098+00 |
| updated_at | 2026-01-17 02:03:36.286177+00 |
| description | Plastic memory capsule with 4-temperature tensor states (hot/warm/cool/cold), bidirectional transitions, and hash-chained auditability |
| homepage | |
| repository | https://github.com/blackfall-labs/thermogram-rs |
| max_upload_size | |
| id | 2027360 |
| size | 308,789 |
Plastic memory capsules with 4-temperature states and embedded SNN plasticity
Thermogram is a synaptic storage system that mimics biological memory consolidation.
Biological memory systems don't have binary hot/cold states - they have gradual crystallization with bidirectional flow:
| Temperature | Analog | Decay Rate | Behavior |
|---|---|---|---|
| Hot | Working memory | Fast (0.1/tick) | Volatile, immediate task, rebuilt on boot |
| Warm | Short-term | Medium (0.01/tick) | Session learning, persists across tasks |
| Cool | Procedural/skill | Slow (0.001/tick) | Expertise, long-term mastery |
| Cold | Core identity | Glacial (0.0001/tick) | Personality backbone, constitutional |
```text
HOT <-> WARM <-> COOL <-> COLD
 |       |        |        |
fast   medium    slow    glacial
decay   decay    decay    decay
```
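For orientation, here is a minimal sketch of how the four layers appear in code. The `ThermalState` variants come from the crate; the decay-rate array below simply restates the table values (in 0.5, per-temperature parameters are stored as `[f32; 4]` arrays ordered hot to cold) and is illustrative, not an API call:

```rust
use thermogram::ThermalState;

// The four temperature layers, ordered hottest to coldest.
let layers = [
    ThermalState::Hot,  // working memory, rebuilt on boot
    ThermalState::Warm, // session learning
    ThermalState::Cool, // procedural skill / long-term mastery
    ThermalState::Cold, // core identity
];

// Per-temperature values use [f32; 4] arrays in the same order;
// these decay rates mirror the table above.
let decay_per_tick: [f32; 4] = [0.1, 0.01, 0.001, 0.0001];
```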
- `reinforce()` strengthens entries and promotes them to colder layers
- `weaken()` or natural decay demotes entries to hotter layers
- `run_thermal_transitions()` handles promotion/demotion based on thresholds

Thermogram's cool layer serves as a drop-in replacement for safetensors checkpoints:
| Safetensors | Thermogram Cool Layer |
|---|---|
| Static checkpoint | Living, evolving weights |
| Full f32 precision | Optional ternary (16x smaller) |
| Load/save only | Read/write/reinforce/weaken |
| No history | Hash-chained audit trail |
| One state | 4 temperatures with transitions |
Migration path:
```rust
use chrono::Utc;
// Import paths below assume these types are re-exported at the crate root.
use thermogram::{ConsolidatedEntry, TernaryWeight};

// Load safetensor weights (load_safetensor stands in for your own loader)
let weights: Vec<f32> = load_safetensor("model.safetensors")?;

// Import into the thermogram cool layer
for (i, weight) in weights.iter().enumerate() {
    let entry = ConsolidatedEntry {
        key: format!("weight_{}", i),
        value: bincode::serialize(weight)?,
        strength: 0.9, // High strength = stays in cool
        ternary_strength: Some(TernaryWeight::from_f32(*weight, 0.3)),
        updated_at: Utc::now(),
        update_count: 1,
    };
    thermogram.cool_entries.insert(entry.key.clone(), entry);
}

// Now weights can evolve:
// - Reinforce successful patterns (may promote to cold)
// - Weaken unused patterns (may demote to warm)
// - All changes hash-chained and auditable
```
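Once imported, the cool layer behaves like any other thermogram state. A minimal follow-up sketch, reusing the `thermogram` handle from the loop above (`weight_0` is simply the first imported key):

```rust
// Reinforce a weight that keeps proving useful; a sufficiently strong entry
// may be promoted from cool to cold on the next thermal pass.
thermogram.reinforce("weight_0", 0.2)?;

// Run promotion/demotion based on current strengths; weakened or unused
// entries may demote back toward warm.
thermogram.run_thermal_transitions()?;

// Persist the evolved state in place of the static safetensors checkpoint.
thermogram.save("model.thermo")?;
```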
Each thermogram has an embedded spiking neural network (SNN) that actively shapes connections at runtime. This is NOT dead code - it's the plasticity engine:
```rust
use thermogram::{EmbeddedSNN, EmbeddedSNNConfig, NeuromodState, PlasticityEngine};

// Create SNN plasticity engine
let config = EmbeddedSNNConfig {
    num_neurons: 100,        // Concept prototypes
    input_dim: 2048,         // Activation dimension
    stdp_lr: 0.01,           // STDP learning rate
    homeostasis_target: 0.1, // Target firing rate
    competition_strength: 0.5,
    decay_rate: 0.001,
    use_ternary: true,       // Use ternary weights
    ..Default::default()
};
let mut snn = EmbeddedSNN::new(config);

// Neuromodulation affects plasticity
let mut neuromod = NeuromodState::baseline();
neuromod.reward(0.3); // Dopamine spike -> increases learning rate

// Process an activation vector from your model
let activation = get_layer_activations(); // e.g., from LLM hidden state
let deltas = snn.process(&activation, &neuromod)?;

// SNN generates deltas based on:
// - STDP: Cells that fire together wire together
// - Homeostasis: Prevents runaway strengthening
// - Competition: Winner-take-most (enforces sparsity)
// - Decay: Natural forgetting of unused connections

// Apply deltas to the thermogram
for delta in deltas {
    thermogram.apply_delta(delta)?;
}
```
The SNN's behavior is modulated by four chemical signals:
| Neuromodulator | Effect on Plasticity |
|---|---|
| Dopamine | Learning rate multiplier (reward signal) |
| Serotonin | Decay rate modulation (confidence) |
| Norepinephrine | Competition strength (arousal/attention) |
| Acetylcholine | Gating modulation (focus) |
```rust
let mut neuromod = NeuromodState::baseline();

// Reward -> increase dopamine -> faster learning
neuromod.reward(0.3);

// Stress -> decrease serotonin, increase NE -> faster forgetting, more competition
neuromod.stress(0.2);

// Focus -> increase acetylcholine -> sharper attention gating
neuromod.focus(0.2);

// Natural decay back to baseline
neuromod.decay(0.1);
```
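Putting neuromodulation together with the embedded SNN, a per-tick plasticity loop might look roughly like this. This is a sketch: `snn` and `thermogram` are the handles from the example above, and `activations_and_rewards` stands in for whatever stream of activations and reward signals your host model provides:

```rust
let mut neuromod = NeuromodState::baseline();

for (activation, reward) in activations_and_rewards {
    // Positive outcomes raise dopamine, which scales the STDP learning rate.
    if reward > 0.0 {
        neuromod.reward(reward);
    }

    // Let the embedded SNN turn the activation into plasticity deltas...
    let deltas = snn.process(&activation, &neuromod)?;

    // ...and apply them to the thermogram's hash-chained delta log.
    for delta in deltas {
        thermogram.apply_delta(delta)?;
    }

    // Neuromodulators drift back toward baseline between ticks.
    neuromod.decay(0.1);
}
```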
If upgrading from thermogram 0.4.x (hot/cold only):
- Thermal states are now `Hot`, `Warm`, `Cool`, `Cold`
- Per-temperature parameters are `[f32; 4]` arrays
- `warm_entries` and `cool_entries` added to `Thermogram`

Old thermogram files (with only `hot_entries` and `cold_entries`) load correctly:

- `warm_entries` and `cool_entries` default to empty `HashMap`s
- `#[serde(default)]` is used for backward compatibility

```rust
// Old 2-temp config
let old_config = ThermalConfig {
    crystallization_threshold: 0.75,
    min_observations: 3,
    prune_threshold: 0.05,
    allow_warming: true,
    warming_delta: 0.3,
};

// New 4-temp config (use defaults and customize)
let mut new_config = ThermalConfig::default();
new_config.promotion_thresholds[2] = old_config.crystallization_threshold; // Cool -> Cold
new_config.prune_threshold = old_config.prune_threshold;

// Or use preset configs:
let config = ThermalConfig::fast_learner(); // Faster promotion, suited to agents
let config = ThermalConfig::organic();      // Slower, more gradual emergence
```
```toml
[dependencies]
thermogram = "0.5"
```
```rust
use thermogram::{Thermogram, PlasticityRule, ThermalState, Delta};

// Create thermogram
let mut thermo = Thermogram::new("my_memory", PlasticityRule::stdp_like());

// Apply delta (learning)
let delta = Delta::update(
    "concept_1",
    bincode::serialize(&vec![0.5f32; 384])?,
    "learning",
    0.8,
    thermo.dirty_chain.head_hash.clone(),
);
thermo.apply_delta(delta)?;

// Reinforce successful patterns
thermo.reinforce("concept_1", 0.2)?;

// Run thermal transitions (promotes/demotes based on strength)
thermo.run_thermal_transitions()?;

// Consolidate (dirty -> clean state)
thermo.consolidate()?;

// Save
thermo.save("memory.thermo")?;
```
| Operation | Time | Throughput |
|---|---|---|
| Read | 17-59 ns | 17M+ ops/sec |
| Write (delta) | 660 ns | 1.5M ops/sec |
| Consolidation | 17 µs (1,000 deltas) | 60K/sec |
| SNN tick | 151 µs (100 neurons) | 6.6K/sec |
| Thermal transition | <1 ms | 1K+/sec |
77 tests passing
v0.5.0 - 4-Temperature Architecture
MIT OR Apache-2.0