| Crates.io | cachekit |
| lib.rs | cachekit |
| version | 0.2.0-alpha |
| created_at | 2026-01-14 00:59:33.396504+00 |
| updated_at | 2026-01-19 10:09:27.268946+00 |
| description | High-performance cache primitives with pluggable eviction policies (LRU, LFU, FIFO, 2Q, Clock-PRO, S3-FIFO) and optional metrics. |
| homepage | https://oxidizelabs.github.io/cachekit/ |
| repository | https://github.com/OxidizeLabs/cachekit |
| max_upload_size | |
| id | 2041866 |
| size | 2,043,602 |
High-performance cache policies and tiered caching primitives for Rust systems with optional metrics and benchmarks.
CacheKit is a Rust library that provides:

- Pluggable eviction policies (LRU, LFU, FIFO, 2Q, Clock-PRO, S3-FIFO)
- Tiered caching primitives
- Optional metrics and benchmarks
- no_std compatibility where appropriate

This crate is designed for systems programming, microservices, and performance-critical applications.

Add cachekit as a dependency in your Cargo.toml:
```toml
[dependencies]
cachekit = { git = "https://github.com/OxidizeLabs/cachekit" }
```
The CacheBuilder provides a unified API for creating caches with any eviction policy:
```rust
use cachekit::builder::{CacheBuilder, CachePolicy};

fn main() {
    // Create an LRU cache with a capacity of 100 entries
    let mut cache = CacheBuilder::new(100).build::<u64, String>(CachePolicy::Lru);

    // Insert items
    cache.insert(1, "value1".to_string());
    cache.insert(2, "value2".to_string());

    // Retrieve an item
    if let Some(value) = cache.get(&1) {
        println!("Got from cache: {}", value);
    }

    // Check existence and size
    assert!(cache.contains(&1));
    assert_eq!(cache.len(), 2);
}
```
The same builder constructs a cache for each supported policy:

```rust
use cachekit::builder::{CacheBuilder, CachePolicy};

// FIFO - First In, First Out
let fifo = CacheBuilder::new(100).build::<u64, String>(CachePolicy::Fifo);

// LRU - Least Recently Used
let lru = CacheBuilder::new(100).build::<u64, String>(CachePolicy::Lru);

// LRU-K - Scan-resistant LRU (K=2 is common)
let lru_k = CacheBuilder::new(100).build::<u64, String>(CachePolicy::LruK { k: 2 });

// LFU - Least Frequently Used (bucket-based, O(1))
let lfu = CacheBuilder::new(100).build::<u64, String>(CachePolicy::Lfu);

// HeapLFU - Least Frequently Used (heap-based, O(log n))
let heap_lfu = CacheBuilder::new(100).build::<u64, String>(CachePolicy::HeapLfu);

// 2Q - Two-Queue with configurable probation fraction
let two_q = CacheBuilder::new(100).build::<u64, String>(
    CachePolicy::TwoQ { probation_frac: 0.25 },
);
```
| Policy | Best For | Eviction Basis |
|---|---|---|
| FIFO | Simple, predictable workloads | Insertion order |
| LRU | Temporal locality | Recency |
| LRU-K | Scan-resistant workloads | K-th access time |
| LFU | Stable access patterns | Frequency (O(1)) |
| HeapLFU | Large caches, frequent evictions | Frequency (O(log n)) |
| 2Q | Mixed workloads | Two-queue promotion |
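
To make the recency column concrete, here is a minimal sketch of LRU eviction using only the builder API shown above. It assumes the capacity passed to CacheBuilder::new is an entry count, that inserting past it evicts exactly one entry, and that get() refreshes an entry's recency; check the crate documentation for the exact semantics.

```rust
use cachekit::builder::{CacheBuilder, CachePolicy};

fn main() {
    // Capacity of 2 entries: a third insert must evict something.
    let mut cache = CacheBuilder::new(2).build::<u64, String>(CachePolicy::Lru);

    cache.insert(1, "a".to_string());
    cache.insert(2, "b".to_string());

    // Touch key 1 so that key 2 becomes the least recently used entry.
    cache.get(&1);

    // Under LRU, inserting key 3 should evict key 2, not key 1.
    cache.insert(3, "c".to_string());

    assert!(cache.contains(&1));
    assert!(cache.contains(&3));
    assert!(!cache.contains(&2));
    assert_eq!(cache.len(), 2);
}
```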
For advanced use cases requiring policy-specific operations, use the underlying implementations directly:
```rust
use std::sync::Arc;

use cachekit::policy::lru::LruCore;
use cachekit::traits::{CoreCache, LruCacheTrait};

fn main() {
    let mut cache: LruCore<u64, &str> = LruCore::new(100);
    cache.insert(1, Arc::new("value"));

    // Policy-specific operations
    if let Some((key, _)) = cache.peek_lru() {
        println!("LRU key: {}", key);
    }
}
```
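
As a quick sketch of the access-order bookkeeping, the snippet below uses only the calls already shown (LruCore::new, insert, peek_lru) and assumes peek_lru reports the entry that would be evicted next when nothing else has touched the cache in between.

```rust
use std::sync::Arc;

use cachekit::policy::lru::LruCore;
use cachekit::traits::{CoreCache, LruCacheTrait};

fn main() {
    let mut cache: LruCore<u64, &str> = LruCore::new(2);
    cache.insert(1, Arc::new("first"));
    cache.insert(2, Arc::new("second"));

    // With no reads since insertion, key 1 was touched least recently,
    // so it should be the next eviction candidate.
    if let Some((key, _)) = cache.peek_lru() {
        println!("next eviction candidate: {}", key);
    }
}
```

Note that LruCore::insert takes Arc-wrapped values in these examples, whereas the builder-based cache stores values directly; reach for the core types only when policy-specific operations like peek_lru are needed.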