Module bio::stats::hmm

An implementation of Hidden Markov Models in Rust.

Examples

Discrete Emission Distribution

We construct the example from Borodovsky & Ekisheva (2006), pp. 80 (also see these slides).

use approx::assert_relative_eq;
use bio::stats::hmm::discrete_emission::Model as DiscreteEmissionHMM;
use bio::stats::hmm::viterbi;
use bio::stats::Prob;
use ndarray::array;

let transition = array![[0.5, 0.5], [0.4, 0.6]];
let observation = array![[0.2, 0.3, 0.3, 0.2], [0.3, 0.2, 0.2, 0.3]];
let initial = array![0.5, 0.5];

let hmm = DiscreteEmissionHMM::with_float(&transition, &observation, &initial)
    .expect("Dimensions should be consistent");
let (path, log_prob) = viterbi(&hmm, &vec![2, 2, 1, 0, 1, 3, 2, 0, 0]);
let prob = Prob::from(log_prob);
assert_relative_eq!(4.25e-8_f64, *prob, epsilon = 1e-9_f64);

Continuous (Gaussian) Emission Distribution

use approx::assert_relative_eq;
use bio::stats::hmm::univariate_continuous_emission::GaussianModel as GaussianHMM;
use bio::stats::hmm::viterbi;
use bio::stats::Prob;
use ndarray::array;
use statrs::distribution::Normal;

let transition = array![[0.5, 0.5], [0.4, 0.6]];
let observation = vec![
    Normal::new(0.0, 1.0).unwrap(),
    Normal::new(2.0, 1.0).unwrap(),
];
let initial = array![0.5, 0.5];

let hmm = GaussianHMM::with_float(&transition, observation, &initial)
    .expect("Dimensions should be consistent");
let (path, log_prob) = viterbi(
    &hmm,
    &vec![-0.1, 0.1, -0.2, 0.5, 0.8, 1.1, 1.2, 1.5, 0.5, 0.2],
);
let prob = Prob::from(log_prob);
assert_relative_eq!(2.64e-8_f64, *prob, epsilon = 1e-9_f64);

Training a Discrete Emission Model with the Baum-Welch Algorithm

We construct the example from Jason Eisner's lecture, which can be followed along in his spreadsheet (link). Take a look at the tests in the source file.
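To give a feel for what one re-estimation step does, here is a self-contained sketch of the Baum-Welch update (plain Rust, linear scale for brevity; this is an illustration, not this crate's implementation, which works in log scale). The numbers reuse the discrete example above rather than Eisner's:

```rust
// Illustrative one-step Baum-Welch (EM) re-estimation for a
// discrete-emission HMM, in linear scale (fine for short sequences).
fn forward(init: &[f64], a: &[Vec<f64>], b: &[Vec<f64>], obs: &[usize]) -> Vec<Vec<f64>> {
    let n = init.len();
    let mut alpha = vec![vec![0.0_f64; n]; obs.len()];
    for i in 0..n {
        alpha[0][i] = init[i] * b[i][obs[0]];
    }
    for t in 1..obs.len() {
        for j in 0..n {
            let s: f64 = (0..n).map(|i| alpha[t - 1][i] * a[i][j]).sum();
            alpha[t][j] = s * b[j][obs[t]];
        }
    }
    alpha
}

fn backward(a: &[Vec<f64>], b: &[Vec<f64>], obs: &[usize], n: usize) -> Vec<Vec<f64>> {
    let mut beta = vec![vec![1.0_f64; n]; obs.len()];
    for t in (0..obs.len() - 1).rev() {
        for i in 0..n {
            let s: f64 = (0..n).map(|j| a[i][j] * b[j][obs[t + 1]] * beta[t + 1][j]).sum();
            beta[t][i] = s;
        }
    }
    beta
}

fn baum_welch_step(
    init: &[f64],
    a: &[Vec<f64>],
    b: &[Vec<f64>],
    obs: &[usize],
) -> (Vec<f64>, Vec<Vec<f64>>, Vec<Vec<f64>>) {
    let (n, m, t_len) = (init.len(), b[0].len(), obs.len());
    let alpha = forward(init, a, b, obs);
    let beta = backward(a, b, obs, n);
    let seq_prob: f64 = alpha[t_len - 1].iter().sum();

    // gamma[t][i]: posterior probability of being in state i at time t.
    let gamma: Vec<Vec<f64>> = (0..t_len)
        .map(|t| (0..n).map(|i| alpha[t][i] * beta[t][i] / seq_prob).collect())
        .collect();
    // xi[t][i][j]: posterior probability of the transition i -> j at time t.
    let xi: Vec<Vec<Vec<f64>>> = (0..t_len - 1)
        .map(|t| {
            (0..n)
                .map(|i| {
                    (0..n)
                        .map(|j| {
                            alpha[t][i] * a[i][j] * b[j][obs[t + 1]] * beta[t + 1][j] / seq_prob
                        })
                        .collect()
                })
                .collect()
        })
        .collect();

    // pi* is gamma at t = 0; a* and b* are expected counts normalised by
    // expected state occupancies.
    let new_init = gamma[0].clone();
    let new_a: Vec<Vec<f64>> = (0..n)
        .map(|i| {
            let occ: f64 = (0..t_len - 1).map(|t| gamma[t][i]).sum();
            (0..n)
                .map(|j| (0..t_len - 1).map(|t| xi[t][i][j]).sum::<f64>() / occ)
                .collect()
        })
        .collect();
    let new_b: Vec<Vec<f64>> = (0..n)
        .map(|j| {
            let occ: f64 = (0..t_len).map(|t| gamma[t][j]).sum();
            (0..m)
                .map(|k| (0..t_len).filter(|&t| obs[t] == k).map(|t| gamma[t][j]).sum::<f64>() / occ)
                .collect()
        })
        .collect();
    (new_init, new_a, new_b)
}

fn main() {
    let init = vec![0.5, 0.5];
    let a = vec![vec![0.5, 0.5], vec![0.4, 0.6]];
    let b = vec![vec![0.2, 0.3, 0.3, 0.2], vec![0.3, 0.2, 0.2, 0.3]];
    let obs: Vec<usize> = vec![2, 2, 1, 0, 1, 3, 2, 0, 0];

    let before: f64 = forward(&init, &a, &b, &obs).last().unwrap().iter().sum();
    let (new_init, new_a, new_b) = baum_welch_step(&init, &a, &b, &obs);
    let after: f64 = forward(&new_init, &new_a, &new_b, &obs).last().unwrap().iter().sum();

    // EM guarantees the observation likelihood does not decrease.
    assert!(after >= before);
    println!("P(obs) before = {:e}, after one step = {:e}", before, after);
}
```

The trait in this module exposes the same step generically; the sketch above only fixes the shapes concrete for two states and four symbols.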

Numeric Stability

The implementation uses log-scale probabilities for numeric stability.
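The underlying issue can be demonstrated with plain f64 arithmetic (a standalone illustration, independent of this crate):

```rust
fn main() {
    // Multiplying many small probabilities underflows in linear scale:
    // 1e-30 ^ 20 = 1e-600 is far below f64's smallest subnormal (~5e-324).
    let p: f64 = 1e-30;
    let linear = (0..20).fold(1.0_f64, |acc, _| acc * p);
    assert_eq!(linear, 0.0); // underflowed to zero

    // The same product in log scale is an ordinary, well-behaved sum.
    let log_scale: f64 = (0..20).map(|_| p.ln()).sum();
    assert!(log_scale.is_finite());
    println!("linear = {}, log-scale = {}", linear, log_scale);
}
```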

Limitations

Currently, only discrete and univariate Gaussian continuous HMMs are implemented. Also, only dense transition matrices and training of discrete models are supported.

References

  • Rabiner, Lawrence R. “A tutorial on hidden Markov models and selected applications in speech recognition.” Proceedings of the IEEE 77, no. 2 (1989): 257-286.
  • Eisner, Jason. “An interactive spreadsheet for teaching the forward-backward algorithm.” In ACL Workshop on Teaching NLP and CL (2002).

Re-exports

pub use self::errors::Error;
pub use self::errors::Result;

Modules

Implementation of Hidden Markov Model with emission values from discrete distributions.

Implementation of Hidden Markov Model with emission values from discrete distributions and an optional explicit end state. This module also implements the Trainable trait, allowing the model to be trained with the Baum-Welch algorithm.

Error definitions for the hmm module.

Implementation of Hidden Markov Models with emission values from univariate continuous distributions.

Structs

A newtype for HMM states.

Iterate over the states of a Model.

Transition between two states in a Model.

Iterate over all state transitions of a Model.

Traits

A trait for Hidden Markov Models (HMM) with generic Observation type.

A trait for training Hidden Markov Models (HMM) with generic Observation type using the Baum-Welch algorithm.

Functions

Execute one step of the Baum-Welch algorithm to find the maximum likelihood estimate of the parameters of an HMM given a set of observed feature vectors, and return the estimated initial state distribution (π*), the estimated transition matrix (A*), the estimated emission probability matrix (B*), and the end probability vector (if the model has declared an end state beforehand). This function doesn’t update the HMM parameters in the model and has been implemented for discrete emission models only. It returns values as LogProb.

Execute the forward algorithm and return the forward probabilities as LogProb values and the resulting forward probability.

Execute the Viterbi algorithm on the given slice of Observation values to get the maximum a posteriori (MAP) probability.
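To illustrate what the Viterbi recursion computes, here is a minimal self-contained sketch (plain Rust, no dependencies; a hypothetical helper, not this crate's implementation), run on the Borodovsky & Ekisheva example from above:

```rust
// delta[t][j] is the best log probability of any state path ending in
// state j at time t; psi[t][j] records the predecessor on that path.
fn viterbi(
    init: &[f64],
    trans: &[Vec<f64>],
    emit: &[Vec<f64>],
    obs: &[usize],
) -> (Vec<usize>, f64) {
    let n = init.len();
    let mut delta = vec![vec![0.0_f64; n]; obs.len()];
    let mut psi = vec![vec![0_usize; n]; obs.len()];
    for i in 0..n {
        delta[0][i] = init[i].ln() + emit[i][obs[0]].ln();
    }
    for t in 1..obs.len() {
        for j in 0..n {
            // Best predecessor state for state j at time t.
            let (best_i, best) = (0..n)
                .map(|i| (i, delta[t - 1][i] + trans[i][j].ln()))
                .max_by(|x, y| x.1.partial_cmp(&y.1).unwrap())
                .unwrap();
            delta[t][j] = best + emit[j][obs[t]].ln();
            psi[t][j] = best_i;
        }
    }
    // Backtrack from the best final state.
    let last = obs.len() - 1;
    let (mut state, log_prob) = (0..n)
        .map(|i| (i, delta[last][i]))
        .max_by(|x, y| x.1.partial_cmp(&y.1).unwrap())
        .unwrap();
    let mut path = vec![state; obs.len()];
    for t in (1..obs.len()).rev() {
        state = psi[t][state];
        path[t - 1] = state;
    }
    (path, log_prob)
}

fn main() {
    // States: 0 = H, 1 = L; symbols: A = 0, C = 1, G = 2, T = 3 (GGCACTGAA).
    let init = vec![0.5, 0.5];
    let trans = vec![vec![0.5, 0.5], vec![0.4, 0.6]];
    let emit = vec![vec![0.2, 0.3, 0.3, 0.2], vec![0.3, 0.2, 0.2, 0.3]];
    let obs = vec![2, 2, 1, 0, 1, 3, 2, 0, 0];
    let (path, log_prob) = viterbi(&init, &trans, &emit, &obs);
    // Matches the probability asserted in the discrete example above.
    assert!((log_prob.exp() - 4.25e-8).abs() < 1e-9);
    println!("path = {:?}, prob = {:e}", path, log_prob.exp());
}
```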