logp 0.1.0

Information theory primitives: entropy, KL divergence, mutual information (KSG estimator), and information-monotone divergences

logp

Information theory primitives: entropies and divergences.

Dual-licensed under MIT or Apache-2.0.

crates.io | docs.rs

```rust
use logp::{entropy_nats, kl_divergence, jensen_shannon_divergence};

let p = [0.1, 0.9];
let q = [0.9, 0.1];

// Shannon entropy in nats
let h = entropy_nats(&p, 1e-9).unwrap();

// Relative entropy (KL)
let kl = kl_divergence(&p, &q, 1e-9).unwrap();

// Symmetric, bounded Jensen-Shannon
let js = jensen_shannon_divergence(&p, &q, 1e-9).unwrap();
```
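The Jensen-Shannon divergence is the average KL divergence of `p` and `q` to their mixture `m = (p + q) / 2`, which is why it is symmetric and bounded by ln 2. A minimal self-contained sketch of that identity (plain Rust; the `kl` and `js` helpers are illustrative, not logp's implementation):

```rust
/// KL divergence in nats between two discrete distributions;
/// terms with p_i == 0 contribute 0. (Illustrative helper, not logp's API.)
fn kl(p: &[f64], q: &[f64]) -> f64 {
    p.iter()
        .zip(q)
        .filter(|(&pi, _)| pi > 0.0)
        .map(|(&pi, &qi)| pi * (pi / qi).ln())
        .sum()
}

/// JS(p, q) = 0.5 * KL(p || m) + 0.5 * KL(q || m), with m = (p + q) / 2.
fn js(p: &[f64], q: &[f64]) -> f64 {
    let m: Vec<f64> = p.iter().zip(q).map(|(&pi, &qi)| 0.5 * (pi + qi)).collect();
    0.5 * kl(p, &m) + 0.5 * kl(q, &m)
}

fn main() {
    let p = [0.1, 0.9];
    let q = [0.9, 0.1];
    // Symmetric in its arguments, and never exceeds ln 2.
    assert!((js(&p, &q) - js(&q, &p)).abs() < 1e-12);
    assert!(js(&p, &q) <= std::f64::consts::LN_2);
    println!("JS(p, q) = {:.6}", js(&p, &q));
}
```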

Taxonomy of Divergences

| Family | Generator | Key Property |
|---|---|---|
| f-divergences | Convex $f(t)$ with $f(1)=0$ | Monotone under Markov morphisms (coarse-graining) |
| Bregman | Convex $F(x)$ | Dually flat geometry; generalized Pythagorean theorem |
| Jensen-Shannon | $f$-div + metric | Symmetric, bounded $[0, \ln 2]$; $\sqrt{JS}$ is a metric |
| Alpha | $\rho_\alpha = \int p^\alpha q^{1-\alpha}$ | Encodes Rényi, Tsallis, Bhattacharyya, Hellinger |
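The f-divergence row can be made concrete: every f-divergence has the form $D_f(p \| q) = \sum_i q_i \, f(p_i / q_i)$ for a convex generator $f$ with $f(1) = 0$, and choosing $f(t) = t \ln t$ recovers KL. A sketch under that definition (plain Rust; `f_divergence` is an illustrative helper, not logp's API):

```rust
/// Generic f-divergence D_f(p || q) = sum_i q_i * f(p_i / q_i)
/// for a convex generator f with f(1) = 0. Assumes q_i > 0.
/// (Illustrative sketch only, not logp's API.)
fn f_divergence(p: &[f64], q: &[f64], f: impl Fn(f64) -> f64) -> f64 {
    p.iter().zip(q).map(|(&pi, &qi)| qi * f(pi / qi)).sum()
}

fn main() {
    let p = [0.1, 0.9];
    let q = [0.9, 0.1];

    // f(t) = t ln t recovers the KL divergence (with 0 ln 0 := 0).
    let kl = f_divergence(&p, &q, |t| if t > 0.0 { t * t.ln() } else { 0.0 });

    // f(t) = (sqrt(t) - 1)^2 gives the squared Hellinger distance
    // (up to a factor of 2, depending on convention).
    let hellinger2 = f_divergence(&p, &q, |t| (t.sqrt() - 1.0).powi(2));

    println!("KL = {:.6}, Hellinger^2 = {:.6}", kl, hellinger2);
}
```

Different generators plugged into the same functional produce the whole family, which is what makes the monotonicity property in the table a single theorem rather than a case-by-case fact.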

Connections

  • rkhs: MMD and KL both measure distribution "distance"
  • wass: Wasserstein vs entropy-based divergences
  • fynch: Temperature scaling affects entropy calibration