# Mini MCMC
A small (and growing) Rust library for Markov Chain Monte Carlo (MCMC) methods.
## Installation

Once published on crates.io, add the following to your `Cargo.toml`:

```toml
[dependencies]
mini_mcmc = "0.2.0"
```

Then you can use `mini_mcmc` in your Rust code.
## Example: Sampling From a 2D Gaussian

This example wires together the `ChainRunner` and `MetropolisHastings` types with one of the library's built-in Gaussian target distributions. You can find the full, runnable code at `examples/minimal_mh.rs`.
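Independent of the library's API, the random-walk Metropolis-Hastings loop that such an example runs can be sketched in plain Rust with no dependencies. Everything here is illustrative rather than taken from the actual example: the `Rng64` and `sample_mean` names, the step size, and the iteration counts are all choices made for this sketch.

```rust
/// Tiny xorshift64 PRNG so the sketch has no external dependencies.
struct Rng64(u64);
impl Rng64 {
    fn next_f64(&mut self) -> f64 {
        self.0 ^= self.0 << 13;
        self.0 ^= self.0 >> 7;
        self.0 ^= self.0 << 17;
        (self.0 >> 11) as f64 / (1u64 << 53) as f64
    }
}

/// Unnormalized log-density of a standard 2D Gaussian.
fn log_prob(p: [f64; 2]) -> f64 {
    -0.5 * (p[0] * p[0] + p[1] * p[1])
}

/// Random-walk Metropolis-Hastings; returns the mean of the kept samples.
fn sample_mean(n: usize, burn_in: usize, seed: u64) -> [f64; 2] {
    let mut rng = Rng64(seed);
    let mut state = [3.0, 3.0]; // deliberately start away from the mode
    let mut sum = [0.0, 0.0];
    for i in 0..n {
        // Symmetric proposal: uniform step in [-0.5, 0.5] per coordinate.
        let cand = [
            state[0] + rng.next_f64() - 0.5,
            state[1] + rng.next_f64() - 0.5,
        ];
        // Accept with probability min(1, p(cand) / p(state)).
        if log_prob(cand) - log_prob(state) > rng.next_f64().max(1e-12).ln() {
            state = cand;
        }
        if i >= burn_in {
            sum[0] += state[0];
            sum[1] += state[1];
        }
    }
    let kept = (n - burn_in) as f64;
    [sum[0] / kept, sum[1] / kept]
}

fn main() {
    // The target has mean (0, 0), so the estimate should land near the origin.
    let m = sample_mean(200_000, 20_000, 7);
    println!("estimated mean ≈ ({:.3}, {:.3})", m[0], m[1]);
}
```

The library packages exactly this accept/reject loop behind its `MetropolisHastings` type, so in practice you would not write it by hand.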
## Example: Sampling From a Custom Distribution

Below we define a custom Poisson distribution over the nonnegative integers $\{0, 1, 2, \dots\}$ and a simple random-walk proposal. We then run Metropolis–Hastings to sample from this distribution, collecting the frequencies of each $k$ after some burn-in:
The core pieces look like this (abridged; the trait implementations and the sampling loop are omitted here):

```rust
use rand::Rng; // for thread_rng

/// A Poisson(lambda) distribution, seen as a discrete target over k = 0, 1, 2, ...
struct PoissonTarget {
    lambda: f64,
}

/// A simple random-walk proposal on the nonnegative integers:
/// - If the current state is 0, always propose 0 -> 1.
/// - Otherwise propose x -> x+1 or x -> x-1 with probability 0.5 each.
struct NonnegativeProposal;

// A small helper for computing ln(k!) is also defined in the example.
```

You can also find this example in full at `examples/poisson_mh.rs`.
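To show the same mechanics without any library at all, here is a self-contained sketch of Metropolis-Hastings with this nonnegative random-walk proposal. The `Rng64`, `ln_factorial`, `log_pmf`, `log_q`, and `poisson_mh_mean` names are inventions for this sketch, and the explicit Hastings correction at the zero boundary is one reasonable choice, not necessarily what the library does:

```rust
/// Tiny xorshift64 PRNG so the sketch has no external dependencies.
struct Rng64(u64);
impl Rng64 {
    fn next_f64(&mut self) -> f64 {
        self.0 ^= self.0 << 13;
        self.0 ^= self.0 >> 7;
        self.0 ^= self.0 << 17;
        (self.0 >> 11) as f64 / (1u64 << 53) as f64
    }
}

/// ln(k!) computed as a sum of logs.
fn ln_factorial(k: usize) -> f64 {
    (1..=k).map(|i| (i as f64).ln()).sum()
}

/// log p(k) = -lambda + k ln(lambda) - ln(k!) for Poisson(lambda).
fn log_pmf(lambda: f64, k: usize) -> f64 {
    -lambda + (k as f64) * lambda.ln() - ln_factorial(k)
}

/// Proposal density q(x -> y): from 0 we always move to 1; otherwise
/// x -> x+1 or x -> x-1 with probability 0.5 each.
fn log_q(x: usize, y: usize) -> f64 {
    match (x, y) {
        (0, 1) => 0.0, // ln(1)
        (x, y) if x > 0 && (y == x + 1 || y + 1 == x) => 0.5f64.ln(),
        _ => f64::NEG_INFINITY,
    }
}

/// Metropolis-Hastings for Poisson(lambda); returns the post-burn-in sample mean.
fn poisson_mh_mean(lambda: f64, n: usize, burn_in: usize, seed: u64) -> f64 {
    let mut rng = Rng64(seed);
    let (mut k, mut sum) = (0usize, 0.0f64);
    for i in 0..n {
        let cand = if k == 0 || rng.next_f64() < 0.5 { k + 1 } else { k - 1 };
        // Hastings ratio, correcting for the asymmetry of the proposal at zero.
        let log_alpha = log_pmf(lambda, cand) - log_pmf(lambda, k)
            + log_q(cand, k) - log_q(k, cand);
        if log_alpha > rng.next_f64().max(1e-12).ln() {
            k = cand;
        }
        if i >= burn_in {
            sum += k as f64;
        }
    }
    sum / (n - burn_in) as f64
}

fn main() {
    // The mean of Poisson(4.0) is 4, so the estimate should land near 4.
    println!("estimated mean ≈ {:.2}", poisson_mh_mean(4.0, 100_000, 10_000, 11));
}
```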
### Explanation

- `PoissonTarget` implements `Target<usize, f64>` for a discrete Poisson($\lambda$) distribution:

  $$p(k) = e^{-\lambda} \frac{\lambda^k}{k!}, \quad k = 0, 1, 2, \ldots$$

  In log form, $\log p(k) = -\lambda + k \log(\lambda) - \log(k!)$.

- `NonnegativeProposal` provides a random walk on the set $\{0, 1, 2, \dots\}$:
  - If $x = 0$, propose $1$ with probability 1.
  - If $x > 0$, propose $x + 1$ or $x - 1$ with probability 0.5 each.

  Its `log_prob` returns $\ln(0.5)$ for valid moves and $-\infty$ for invalid ones.

- Usage: We start the chain at $k = 0$, run 10,000 iterations discarding the first 1,000 as burn-in, and tally the final sample frequencies for $k = 0, \dots, 20$. They should approximate the Poisson(4.0) distribution (peaked around $k = 4$).
With this example, you can see how to use `mini_mcmc` for unbounded discrete distributions via a custom random-walk proposal and a log-PMF.
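As a quick numeric sanity check of the log-PMF formula above (a standalone computation, not library code; the `pmf` helper is defined here for illustration), the probabilities for $k = 0..20$ should sum to nearly 1 when $\lambda = 4$, and since successive terms satisfy $p(k)/p(k-1) = \lambda/k$, the distribution has tied modes at $k = 3$ and $k = 4$:

```rust
/// Poisson(lambda) pmf via the log form: exp(-lambda + k ln(lambda) - ln(k!)).
fn pmf(lambda: f64, k: usize) -> f64 {
    let ln_k_fact: f64 = (1..=k).map(|i| (i as f64).ln()).sum();
    (-lambda + (k as f64) * lambda.ln() - ln_k_fact).exp()
}

fn main() {
    let lambda = 4.0;
    // k = 0..20 captures essentially all of the mass of Poisson(4).
    let total: f64 = (0..=20).map(|k| pmf(lambda, k)).sum();
    println!("sum over k=0..20: {:.6}", total);
    // p(k)/p(k-1) = lambda/k, so p(3) == p(4) when lambda = 4.
    println!("p(3) = {:.6}, p(4) = {:.6}", pmf(lambda, 3), pmf(lambda, 4));
}
```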
## Overview
This library provides:
- Metropolis-Hastings: A generic implementation suitable for various target distributions and proposal mechanisms.
- Distributions: Handy Gaussian and isotropic Gaussian implementations, along with traits for defining custom log-prob functions.
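For a sense of what defining a custom log-prob might look like, here is a sketch. The `Target` trait below is a local stand-in with an assumed shape (a single `log_prob` method over a slice of states); the crate's real trait may differ in name, generics, and required methods, and `Gaussian1D` is a made-up example type:

```rust
/// A stand-in for the library's target-distribution trait (assumed shape:
/// one method returning the unnormalized log-density). The real trait in
/// mini_mcmc may differ in name, generics, and required methods.
trait Target<S, T> {
    fn log_prob(&self, state: &[S]) -> T;
}

/// An illustrative 1D Gaussian with the given mean and standard deviation.
struct Gaussian1D {
    mean: f64,
    std: f64,
}

impl Target<f64, f64> for Gaussian1D {
    fn log_prob(&self, state: &[f64]) -> f64 {
        let z = (state[0] - self.mean) / self.std;
        // Unnormalized log-density: -z^2 / 2 (additive constants dropped).
        -0.5 * z * z
    }
}

fn main() {
    let g = Gaussian1D { mean: 1.0, std: 2.0 };
    // The unnormalized log-density peaks (at 0) when the state equals the mean.
    println!("{}", g.log_prob(&[1.0]));
}
```

Because MCMC acceptance ratios only compare densities, dropping the normalizing constant in `log_prob` is harmless and is the usual convention.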
## Roadmap
- Parallel Chains: Run multiple Metropolis-Hastings Markov chains in parallel. ✅
- Discrete & Continuous Distributions: Get Metropolis-Hastings to work for continuous and discrete distributions. ✅
- Generic Datatypes: Support sampling vectors of arbitrary integer or floating point types. ✅
- Gibbs Sampling: A component-wise MCMC approach for higher-dimensional problems. ✅
- Hamiltonian Monte Carlo (HMC): A gradient-based method for efficient exploration.
- No-U-Turn Sampler (NUTS): An extension of HMC that removes the need to choose path lengths.
- Ensemble Slice Sampling (ESS): Efficient gradient-free sampler, see paper.
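As an illustration of the Gibbs-sampling idea already checked off above, here is a minimal, dependency-free sketch (not the library's implementation; `Rng64` and `gibbs_correlation` are names invented for this sketch). It samples a bivariate standard Gaussian with correlation `rho` by alternately drawing each coordinate from its exact conditional $N(\rho \cdot \text{other}, 1 - \rho^2)$:

```rust
/// Tiny xorshift64 PRNG so the sketch has no external dependencies.
struct Rng64(u64);
impl Rng64 {
    fn next_f64(&mut self) -> f64 {
        self.0 ^= self.0 << 13;
        self.0 ^= self.0 >> 7;
        self.0 ^= self.0 << 17;
        (self.0 >> 11) as f64 / (1u64 << 53) as f64
    }
    /// Standard normal draw via the Box-Muller transform.
    fn next_gauss(&mut self) -> f64 {
        let (u1, u2) = (self.next_f64().max(1e-12), self.next_f64());
        (-2.0 * u1.ln()).sqrt() * (std::f64::consts::TAU * u2).cos()
    }
}

/// Gibbs sampling for a bivariate standard Gaussian with correlation `rho`:
/// each full conditional is N(rho * other, 1 - rho^2). Returns the empirical
/// correlation of the collected samples.
fn gibbs_correlation(rho: f64, n: usize, seed: u64) -> f64 {
    let mut rng = Rng64(seed);
    let (mut x, mut y) = (0.0f64, 0.0f64);
    let sd = (1.0 - rho * rho).sqrt();
    let (mut sx, mut sy, mut sxx, mut syy, mut sxy) = (0.0, 0.0, 0.0, 0.0, 0.0);
    for _ in 0..n {
        x = rho * y + sd * rng.next_gauss(); // draw x | y
        y = rho * x + sd * rng.next_gauss(); // draw y | x
        sx += x;
        sy += y;
        sxx += x * x;
        syy += y * y;
        sxy += x * y;
    }
    let m = n as f64;
    let (mx, my) = (sx / m, sy / m);
    (sxy / m - mx * my) / ((sxx / m - mx * mx).sqrt() * (syy / m - my * my).sqrt())
}

fn main() {
    // With many draws the empirical correlation should be close to rho.
    println!("estimated rho ≈ {:.3}", gibbs_correlation(0.8, 100_000, 42));
}
```

Because each coordinate is drawn from its exact conditional, every Gibbs move is accepted; that is what makes the method attractive when conditionals are tractable.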
## Structure

- `src/lib.rs`: The main library entry point; exports MCMC functionality.
- `src/distributions.rs`: Target distributions (e.g., multivariate Gaussians) and proposal distributions.
- `src/metropolis_hastings.rs`: The Metropolis-Hastings algorithm implementation.
- `src/gibbs.rs`: The Gibbs sampling algorithm implementation.
- `examples/demo.rs`: Example usage demonstrating 2D Gaussian sampling and plotting.
## Usage (Local)

- Build (Library + Demo):

  ```shell
  cargo build
  ```

- Run the Demo:

  ```shell
  cargo run --example demo
  ```

  Prints basic statistics of the MCMC chain (e.g., the estimated mean) and saves a scatter plot of sampled points to `scatter_plot.png` along with a Parquet file `samples.parquet`.
## Optional Features

- `csv`: Enables CSV I/O for samples.
- `arrow`/`parquet`: Enables Apache Arrow / Parquet I/O.

By default, all features are enabled; you can disable them if you want a smaller build.
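For example, a minimal sketch of a `Cargo.toml` dependency entry that turns off the default features and re-enables only `csv` (feature names as listed above; the version is the one mentioned in Installation):

```toml
[dependencies]
mini_mcmc = { version = "0.2.0", default-features = false, features = ["csv"] }
```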
## License
Licensed under the Apache License, Version 2.0. See LICENSE for details.
This project includes code from the kolmogorov_smirnov project, licensed under Apache 2.0 as noted in NOTICE.