# logp
Information theory primitives: entropies and divergences.
Dual-licensed under MIT or Apache-2.0.
```rust
use logp::{entropy_nats, kl_divergence, jensen_shannon_divergence};

// Example probability distributions (non-negative, summing to 1).
let p = [0.5, 0.25, 0.25];
let q = [0.4, 0.4, 0.2];

// Shannon entropy in nats
let h = entropy_nats(&p).unwrap();
// Relative entropy (KL)
let kl = kl_divergence(&p, &q).unwrap();
// Symmetric, bounded Jensen-Shannon
let js = jensen_shannon_divergence(&p, &q).unwrap();
```
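For reference, the relative entropy computed above is $D_{\mathrm{KL}}(p \,\|\, q) = \sum_i p_i \ln(p_i / q_i)$ in nats. A minimal standalone sketch of that sum (illustrative only; `kl_nats` is not part of the crate's API):

```rust
/// Reference sketch of KL divergence in nats for discrete distributions.
/// Returns `None` when the divergence is undefined, i.e. some p_i > 0
/// where q_i = 0 (p not absolutely continuous w.r.t. q).
fn kl_nats(p: &[f64], q: &[f64]) -> Option<f64> {
    let mut sum = 0.0;
    for (&pi, &qi) in p.iter().zip(q) {
        if pi == 0.0 {
            continue; // 0 * ln(0 / q) is taken to be 0
        }
        if qi == 0.0 {
            return None;
        }
        sum += pi * (pi / qi).ln();
    }
    Some(sum)
}
```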
## Taxonomy of Divergences
| Family | Generator | Key Property |
|---|---|---|
| f-divergences | Convex $f(t)$ with $f(1)=0$ | Monotone under Markov morphisms (coarse-graining) |
| Bregman | Convex $F(x)$ | Dually flat geometry; generalized Pythagorean theorem |
| Jensen-Shannon | $f$-div + metric | Symmetric, bounded $[0, \ln 2]$, $\sqrt{JS}$ is a metric |
| Alpha | $\rho_\alpha = \int p^{\alpha} q^{1-\alpha}$ | Encodes Rényi, Tsallis, Bhattacharyya, Hellinger |
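As a concrete instance of the alpha family: for discrete distributions, $\rho_{1/2} = \sum_i \sqrt{p_i q_i}$ is the Bhattacharyya coefficient, and the squared Hellinger distance is $H^2 = 1 - \rho_{1/2}$. A hedged sketch (function names are illustrative, not part of the crate's API):

```rust
/// Alpha-family coefficient rho_alpha = sum_i p_i^alpha * q_i^(1 - alpha)
/// for discrete distributions (illustrative sketch).
fn rho_alpha(p: &[f64], q: &[f64], alpha: f64) -> f64 {
    p.iter()
        .zip(q)
        .map(|(&pi, &qi)| pi.powf(alpha) * qi.powf(1.0 - alpha))
        .sum()
}

/// Squared Hellinger distance, recovered from rho_alpha at alpha = 1/2.
fn hellinger_sq(p: &[f64], q: &[f64]) -> f64 {
    1.0 - rho_alpha(p, q, 0.5)
}
```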