| Crates.io | logp |
| lib.rs | logp |
| version | 0.1.0 |
| created_at | 2026-01-18 14:53:39.225739+00 |
| updated_at | 2026-01-18 14:53:39.225739+00 |
| description | Information theory primitives: entropy, KL divergence, mutual information (KSG estimator), and information-monotone divergences |
| homepage | https://github.com/arclabs561/logp |
| repository | https://github.com/arclabs561/logp |
| max_upload_size | |
| id | 2052437 |
| size | 58,001 |
Information theory primitives: entropies and divergences.
Dual-licensed under MIT or Apache-2.0.
```rust
use logp::{entropy_nats, kl_divergence, jensen_shannon_divergence};

let p = [0.1, 0.9];
let q = [0.9, 0.1];

// Shannon entropy in nats
let h = entropy_nats(&p, 1e-9).unwrap();
// Relative entropy (KL)
let kl = kl_divergence(&p, &q, 1e-9).unwrap();
// Symmetric, bounded Jensen-Shannon
let js = jensen_shannon_divergence(&p, &q, 1e-9).unwrap();
```
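The description also lists mutual information via the KSG (Kraskov–Stögbauer–Grassberger) estimator for continuous samples; that function's signature is not shown above, so rather than guess at the crate's API, here is a standalone sketch of the textbook discrete quantity the estimator targets, $I(X;Y)=\sum_{x,y} p(x,y)\ln\frac{p(x,y)}{p(x)\,p(y)}$, computed directly from a joint probability table:

```rust
/// Discrete mutual information I(X;Y) in nats from a joint table:
/// I = sum_{x,y} p(x,y) * ln( p(x,y) / (p(x) p(y)) ).
/// Assumes a non-empty, rectangular table; zero cells contribute
/// nothing (the 0 * ln 0 := 0 convention).
fn mutual_information_nats(joint: &[Vec<f64>]) -> f64 {
    let rows = joint.len();
    let cols = joint[0].len();
    // Marginals p(x) (row sums) and p(y) (column sums).
    let px: Vec<f64> = joint.iter().map(|r| r.iter().sum()).collect();
    let mut py = vec![0.0; cols];
    for r in joint {
        for (j, &v) in r.iter().enumerate() {
            py[j] += v;
        }
    }
    let mut mi = 0.0;
    for i in 0..rows {
        for j in 0..cols {
            let pxy = joint[i][j];
            if pxy > 0.0 {
                mi += pxy * (pxy / (px[i] * py[j])).ln();
            }
        }
    }
    mi
}

fn main() {
    // Perfectly correlated bits: I(X;Y) = ln 2 ≈ 0.693147 nats.
    let joint = vec![vec![0.5, 0.0], vec![0.0, 0.5]];
    println!("I(X;Y) = {:.6} nats", mutual_information_nats(&joint));
}
```

For perfectly correlated bits the table is diagonal and the result is $\ln 2$, the same bound that appears for Jensen-Shannon in the table below.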
| Family | Generator | Key Property |
|---|---|---|
| f-divergences | Convex $f(t)$ with $f(1)=0$ | Monotone under Markov morphisms (coarse-graining) |
| Bregman | Convex $F(x)$ | Dually flat geometry; generalized Pythagorean theorem |
| Jensen-Shannon | $f(t) = \frac{t}{2}\ln t - \frac{1+t}{2}\ln\frac{1+t}{2}$ | Symmetric, bounded $[0, \ln 2]$; $\sqrt{\mathrm{JS}}$ is a metric |
| Alpha | $\rho_\alpha = \int p^\alpha q^{1-\alpha}$ | Encodes Rényi, Tsallis, Bhattacharyya, Hellinger |
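To make the first row concrete: an f-divergence is $D_f(p\|q)=\sum_i q_i\, f(p_i/q_i)$, and choosing the convex generator $f$ picks out the family member. A minimal standalone sketch of that pattern (again independent of the crate's actual API, which only documents the three functions used above):

```rust
/// Generic f-divergence D_f(p || q) = sum_i q_i * f(p_i / q_i).
/// Assumes p and q are valid distributions with q_i > 0 everywhere.
fn f_divergence(p: &[f64], q: &[f64], f: impl Fn(f64) -> f64) -> f64 {
    p.iter().zip(q).map(|(&pi, &qi)| qi * f(pi / qi)).sum()
}

fn main() {
    let p = [0.1, 0.9];
    let q = [0.9, 0.1];
    // KL divergence: generator f(t) = t ln t.
    let kl = f_divergence(&p, &q, |t| t * t.ln());
    // Squared Hellinger distance: generator f(t) = (sqrt(t) - 1)^2 / 2.
    let h2 = f_divergence(&p, &q, |t| (t.sqrt() - 1.0).powi(2) / 2.0);
    println!("KL = {kl:.6} nats, Hellinger^2 = {h2:.6}");
}
```

The last row compresses several named quantities into the single integral $\rho_\alpha$; the standard identities are

$$
D^{\mathrm{R}}_\alpha(p\|q) = \frac{\ln \rho_\alpha}{\alpha - 1}, \qquad
D^{\mathrm{T}}_\alpha(p\|q) = \frac{\rho_\alpha - 1}{\alpha - 1}, \qquad
\mathrm{BC}(p,q) = \rho_{1/2}, \qquad
H^2(p,q) = 1 - \rho_{1/2},
$$

for the Rényi divergence $D^{\mathrm{R}}_\alpha$, Tsallis divergence $D^{\mathrm{T}}_\alpha$, Bhattacharyya coefficient $\mathrm{BC}$, and squared Hellinger distance $H^2$ (in the $\frac{1}{2}\sum_i(\sqrt{p_i}-\sqrt{q_i})^2$ convention).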