Trait simsimd::ProbabilitySimilarity

pub trait ProbabilitySimilarity
where
    Self: Sized,
{
    // Required methods
    fn jensenshannon(a: &[Self], b: &[Self]) -> Option<f64>;
    fn kullbackleibler(a: &[Self], b: &[Self]) -> Option<f64>;
}

The ProbabilitySimilarity trait provides methods for computing similarity and divergence measures between probability distributions, such as the Jensen-Shannon divergence and the Kullback-Leibler divergence.

These methods are particularly useful in contexts such as information theory and machine learning, where one often needs to measure how one probability distribution differs from a second, reference probability distribution.

Required Methods


fn jensenshannon(a: &[Self], b: &[Self]) -> Option<f64>

Computes the Jensen-Shannon divergence between two probability distributions. The Jensen-Shannon divergence is a method of measuring the similarity between two probability distributions. It is based on the Kullback-Leibler divergence, but is symmetric and always has a finite value.
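For reference, the Jensen-Shannon divergence of distributions P and Q is the average Kullback-Leibler divergence of each to their mixture M = (P + Q) / 2, which is why it stays symmetric and finite. The following is a minimal usage sketch, assuming the trait is imported from the simsimd crate root and that None signals inputs that cannot be compared (for example, slices of different lengths); the distributions p and q are hypothetical.

use simsimd::ProbabilitySimilarity;

fn main() {
    // Two hypothetical discrete distributions, each summing to 1.0.
    let p: Vec<f32> = vec![0.10, 0.40, 0.30, 0.20];
    let q: Vec<f32> = vec![0.20, 0.30, 0.30, 0.20];

    // Jensen-Shannon divergence is symmetric: JS(p, q) == JS(q, p).
    let js_pq = f32::jensenshannon(&p, &q).expect("divergence computed");
    let js_qp = f32::jensenshannon(&q, &p).expect("divergence computed");
    println!("JS(p, q) = {js_pq}, JS(q, p) = {js_qp}");
}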


fn kullbackleibler(a: &[Self], b: &[Self]) -> Option<f64>

Computes the Kullback-Leibler divergence between two probability distributions. The Kullback-Leibler divergence is a measure of how one probability distribution diverges from a second, expected probability distribution.
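Unlike the Jensen-Shannon divergence, this measure is asymmetric. A minimal sketch illustrating that, under the same assumptions as above (trait imported from the crate root, None on incomparable inputs, hypothetical distributions p and q):

use simsimd::ProbabilitySimilarity;

fn main() {
    let p: Vec<f64> = vec![0.50, 0.25, 0.25];
    let q: Vec<f64> = vec![0.40, 0.40, 0.20];

    // Kullback-Leibler divergence is asymmetric:
    // KL(p || q) generally differs from KL(q || p).
    let kl_pq = f64::kullbackleibler(&p, &q).expect("divergence computed");
    let kl_qp = f64::kullbackleibler(&q, &p).expect("divergence computed");
    println!("KL(p || q) = {kl_pq}, KL(q || p) = {kl_qp}");
}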

Dyn Compatibility

This trait is not dyn compatible.

In older versions of Rust, dyn compatibility was called "object safety", so this trait is not object safe.
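In practice this means the trait cannot be used behind dyn (for example as &dyn ProbabilitySimilarity); it is used through generic bounds instead. A minimal sketch of such a generic helper, assuming the trait is imported from the simsimd crate root (the function name js_divergence is hypothetical):

use simsimd::ProbabilitySimilarity;

// Generic over any element type implementing the trait (f32 or f64 here),
// since a trait object cannot be formed for this trait.
fn js_divergence<T: ProbabilitySimilarity>(a: &[T], b: &[T]) -> Option<f64> {
    T::jensenshannon(a, b)
}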

Implementations on Foreign Types

impl ProbabilitySimilarity for f32
    fn jensenshannon(a: &[Self], b: &[Self]) -> Option<f64>
    fn kullbackleibler(a: &[Self], b: &[Self]) -> Option<f64>

impl ProbabilitySimilarity for f64
    fn jensenshannon(a: &[Self], b: &[Self]) -> Option<f64>
    fn kullbackleibler(a: &[Self], b: &[Self]) -> Option<f64>

Implementors