Crate markovian

Simulation of (sub-)stochastic processes.

Goal

Serve as an extension of the rand crate for sub-stochastic processes.

Examples

Discrete time

Construction of a random walk on the integers.

use markovian::raw_dist;
use rand::thread_rng;

let init_state: i32 = 0;
// Step +1 or -1 with equal probability
let transition = |state: &i32| raw_dist![(0.5, state + 1), (0.5, state - 1)];
let rng = thread_rng();
let mut mc = markovian::MarkovChain::new(init_state, transition, rng);
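MarkovChain implements Iterator (it is used this way in the time-dependence example further below), so a finite path can be collected directly:

// Draw the first 10 states of the walk
let path: Vec<i32> = mc.take(10).collect();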

Branching process

Construction using the offspring density p(0) = 0.3, p(1) = 0.4, p(2) = 0.3.

use markovian::raw_dist;
use rand::thread_rng;

let init_state: u32 = 1;
let base_distribution = raw_dist![(0.3, 0), (0.4, 1), (0.3, 2)];
let rng = thread_rng();
let mut branching_process = markovian::BranchingProcess::new(init_state, base_distribution, rng);
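A usage sketch, assuming BranchingProcess can be iterated over successive generation sizes in the same way the Markov chains above are iterated:

// Assumption: the iterator yields the population size of each generation
let generations: Vec<u32> = branching_process.take(5).collect();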

Continuous time

Construction of a random walk on the integers, with exponential waiting times between transitions.

use rand::distributions::{Distribution, Uniform};
use rand::{thread_rng, Rng};
use rand_distr::Exp;

let init_state: i32 = 0;
struct MyTransition;
impl markovian::Transition<i32, (f64, i32)> for MyTransition {
    fn sample_from<R: ?Sized>(&self, state: &i32, rng: &mut R) -> (f64, i32)
    where
        R: Rng
    {
        // Exponential holding time with rate 2.0
        let time = Exp::new(2.0).unwrap().sample(rng);
        // Step +1 or -1 with equal probability
        let step = Uniform::from(0..=1).sample(rng) * 2 - 1;
        (time, state + step)
    }
}
let transition = MyTransition;
let rng = thread_rng();
let mut mc = markovian::TimedMarkovChain::new(init_state, transition, rng);

Remarks

All methods are inline, by design.

Non-trivial ways to use the crate are described below, including time dependence, continuous space, and non-Markovian processes.

Time dependence

Include the time as part of the state of the process.

Examples

A random walk on the integers that tends to move more to the right as time goes by.

use markovian::raw_dist;
use rand::thread_rng;

let init_state: (usize, i32) = (0, 0);
let transition = |(time, state): &(usize, i32)| raw_dist![
    (0.6 - 1.0 / (time + 2) as f64, (time + 1, state + 1)),
    (0.4 + 1.0 / (time + 2) as f64, (time + 1, state - 1))
];
let rng = thread_rng();
let mut mc = markovian::MarkovChain::new(init_state, &transition, rng);

// Take a sample of 10 elements
mc.take(10).map(|(_, state)| state).collect::<Vec<i32>>();

Continuous space

Randomize the transition itself: have it return a single random element, assigned probability one.

Examples

A random walk on the real line with variable step size.

use rand::{thread_rng, Rng};
use rand_distr::{Distribution, Exp};

let init_state: f64 = 0.0;
struct MyTransition;
impl markovian::Transition<f64, f64> for MyTransition {
    fn sample_from<R: ?Sized>(&self, state: &f64, rng: &mut R) -> f64
    where
        R: Rng
    {
        // Exponentially distributed (hence positive) step size
        let step = Exp::new(2.0).unwrap().sample(rng);
        state + step
    }
}
let transition = MyTransition;
let rng = thread_rng();
let mut mc = markovian::MarkovChain::new(init_state, transition, rng);
mc.next();

// The current state is positive
assert!(mc.state().unwrap() > &0.0);

Non-Markovian

Include history in the state. For example, instead of i32, use Vec<i32>.

Examples

A random walk on the integers that is attracted to zero in a non-Markovian fashion.

use markovian::raw_dist;
use rand::thread_rng;

let init_state: Vec<i32> = vec![0];
let transition = |state: &Vec<i32>| {
    // New possible states: append a step to the right or to the left
    let mut right = state.clone();
    right.push(state[state.len() - 1] + 1);
    let mut left = state.clone();
    left.push(state[state.len() - 1] - 1);

    // A non-Markovian transition: the step depends on the whole path
    let path_statistic: i32 = state.iter().sum();
    if path_statistic.is_positive() {
        raw_dist![
            (1.0 / (path_statistic.abs() + 1) as f64, right),
            (1.0 - 1.0 / (path_statistic.abs() + 1) as f64, left)
        ]
    } else {
        raw_dist![
            (1.0 - 1.0 / (path_statistic.abs() + 1) as f64, right),
            (1.0 / (path_statistic.abs() + 1) as f64, left)
        ]
    }
};
let rng = thread_rng();
let mut mc = markovian::MarkovChain::new(init_state, transition, rng);

// The state carries its own history
mc.next();
assert_eq!(mc.state().unwrap().len(), 2);

Modules

distributions

Ease interoperability with the rand_distr crate.

errors

Errors of this crate.

prelude

Ease of use of this crate in general.

Macros

raw_dist

Creates a Raw struct by first allocating a Vec and passing it to Raw::new.
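For instance, a minimal sketch of the equivalent hand-written construction (assuming Raw is the struct exposed by the distributions module):

use markovian::{distributions::Raw, raw_dist};

// The macro allocates a Vec of (probability, value) pairs and hands it to Raw::new
let from_macro = raw_dist![(0.5, 1), (0.5, 2)];
let by_hand = Raw::new(vec![(0.5, 1), (0.5, 2)]);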

Structs

BranchingProcess

Branching process in the natural numbers ℕ = {0, 1, 2, ...}.

ContFiniteMarkovChain

Finite state Markov Chain in continuous time.

FiniteMarkovChain

Finite state Markov Chain in discrete time.

MarkovChain

Markov Chain in discrete time, with arbitrary space.

TimedMarkovChain

Markov Chain in continuous time, with arbitrary space.

Traits

State

Possible public state.

StateIterator

Iterator with an internal state that is thought of as the "zero" element.

Transition

Abstraction over transition matrix.