| Crates.io | llm-sentinel-storage |
| lib.rs | llm-sentinel-storage |
| version | 0.1.0 |
| created_at | 2025-11-06 07:59:02.763375+00 |
| updated_at | 2025-11-06 07:59:02.763375+00 |
| description | InfluxDB time-series storage and multi-layer caching (Moka + Redis) for LLM-Sentinel |
| homepage | https://github.com/globalbusinessadvisors/llm-sentinel |
| repository | https://github.com/globalbusinessadvisors/llm-sentinel |
| max_upload_size | |
| id | 1919271 |
| size | 104,217 |
InfluxDB time-series storage and multi-layer caching for LLM-Sentinel.
High-performance storage layer combining InfluxDB time-series persistence with two caching tiers: an in-process Moka cache and a shared Redis cache (a sketch of the two-tier lookup pattern follows the usage example below).
Add this to your Cargo.toml:
```toml
[dependencies]
llm-sentinel-storage = "0.1.0"
```
```rust
use llm_sentinel_storage::{StorageManager, StorageConfig};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = StorageConfig {
        influxdb_url: "http://localhost:8086".to_string(),
        influxdb_org: "sentinel".to_string(),
        influxdb_token: std::env::var("INFLUXDB_TOKEN")?,
        telemetry_bucket: "telemetry".to_string(),
        ..Default::default()
    };

    let manager = StorageManager::new(config).await?;

    // Store a telemetry event (`event` is a telemetry value constructed elsewhere)
    manager.store_telemetry(&event).await?;

    // Query the last 24 hours of data for one service
    let results = manager.query_telemetry(
        "service_name = 'chat-api'",
        chrono::Duration::hours(24),
    ).await?;

    Ok(())
}
```
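
The two cache tiers are typically combined as a read-through lookup: check the in-process Moka cache first, fall back to Redis, and only go to the backing store on a miss in both. The sketch below illustrates that pattern using the `moka` and `redis` crates directly; the function names, the 300-second TTL, and the `fetch_from_influx` helper are illustrative assumptions, not llm-sentinel-storage's actual API.

```rust
use moka::future::Cache;
use redis::AsyncCommands;

// A minimal two-tier read-through lookup, assuming `moka` 0.12+ and the
// async `redis` client. This is a sketch of the pattern, not the crate's API.
async fn cached_lookup(
    local: &Cache<String, String>,                  // tier 1: in-process Moka cache
    remote: &mut redis::aio::MultiplexedConnection, // tier 2: shared Redis cache
    key: &str,
) -> Result<String, Box<dyn std::error::Error>> {
    // Tier 1: no network hop.
    if let Some(value) = local.get(key).await {
        return Ok(value);
    }

    // Tier 2: shared across service instances.
    let hit: Option<String> = remote.get(key).await?;
    if let Some(value) = hit {
        // Promote to the local tier before returning.
        local.insert(key.to_string(), value.clone()).await;
        return Ok(value);
    }

    // Miss on both tiers: fall through to the backing store (hypothetical helper).
    let value = fetch_from_influx(key).await?;

    // Populate both tiers; the 300 s Redis TTL is an arbitrary example value.
    let _: () = remote.set_ex(key, &value, 300).await?;
    local.insert(key.to_string(), value.clone()).await;
    Ok(value)
}

// Hypothetical slow path; in a real deployment this would be an InfluxDB query.
async fn fetch_from_influx(_key: &str) -> Result<String, Box<dyn std::error::Error>> {
    Ok("example value".to_string())
}
```

Whichever tier answers, promoting the value back into the faster tier keeps repeated reads local and off the network.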
Apache-2.0