markovr 
Higher-order Markov Chains can have longer memories than your typical Markov Chain, which looks back only 1 element. They are the basic building block for the WaveFunctionCollapse algorithm. A zeroth-order Markov Chain is the equivalent of a weighted die.
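To make the orders concrete, here is a small dependency-free sketch (plain std, not markovr; the `train` helper is purely illustrative) that counts weighted transitions. With order 0 there is a single empty context, so the counts are exactly the face weights of a weighted die; with order 2, each count is conditioned on the two preceding symbols.

```rust
use std::collections::HashMap;

/// Count how often each symbol follows each `order`-length context.
/// With order = 0 the single empty context just tallies overall symbol
/// frequencies -- i.e. a weighted die.
fn train(data: &[u64], order: usize) -> HashMap<Vec<u64>, HashMap<u64, u64>> {
    let mut model: HashMap<Vec<u64>, HashMap<u64, u64>> = HashMap::new();
    for i in order..data.len() {
        let ctx = data[i - order..i].to_vec();
        *model.entry(ctx).or_default().entry(data[i]).or_insert(0) += 1;
    }
    model
}

fn main() {
    let data = [1u64, 2, 3, 1, 2, 4];
    // Order 0: weights over all symbols, regardless of context.
    let die = train(&data, 0);
    assert_eq!(die[&vec![]][&1u64], 2); // `1` occurred twice overall
    // Order 2: after the context [1, 2], we saw `3` once and `4` once.
    let second = train(&data, 2);
    assert_eq!(second[&vec![1u64, 2]][&3u64], 1);
    assert_eq!(second[&vec![1u64, 2]][&4u64], 1);
}
```

Sampling a successor in proportion to those counts is what turns the tables into a Markov chain; the order-0 table degenerates to rolling the weighted die.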
Usage
Add this to your Cargo.toml:
[dependencies]
markovr = { version = "0.2" }
Alternatively, if you don't want to pull the rand crate into your dependency tree:
[dependencies]
markovr = { version = "0.2", default-features = false, features = [] }
Then it's as simple as this:
// Create a new, first-order Markov Chain.
// (Method signatures below follow markovr's docs; double-check them
// against the version you pull in.)
let mut m = markovr::MarkovChain::new(1);

// Create a two-way mapping between your input data and u64s.
// Each of your inputs needs a unique u64 value.
// std::Hash can be your friend here, but let's do it ourselves.
// alpha will be both our encoding mapping and training data.
let alpha: Vec<char> = "abcdefghijklmnopqrstuvwxyz".chars().collect();

// encoded is a parallel Vec to alpha that contains the u64 unique ids
// for each character.
let encoded: Vec<u64> = (0..alpha.len()).map(|i| i as u64).collect();

// Train the model: the `order` values before position i predict encoded[i].
for i in m.order..encoded.len() {
    m.train(&encoded[i - m.order..i], encoded[i]);
}

// Generate values from the model: given one id, sample a likely successor.
for i in 0..(alpha.len() - 1) {
    if let Some(next) = m.generate(&[encoded[i]]) {
        print!("{} -> {} ", alpha[i], alpha[next as usize]);
    }
}
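The comments above mention std::Hash as an alternative way to produce the unique u64 ids. A minimal sketch using the standard library's DefaultHasher (independent of markovr; `id_of` is an illustrative helper, not a crate API):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Map any hashable value to a u64 id. Ids produced this way are unique
/// only with high probability (hash collisions are possible), and
/// DefaultHasher's output is not guaranteed stable across Rust releases.
fn id_of<T: Hash>(value: &T) -> u64 {
    let mut h = DefaultHasher::new();
    value.hash(&mut h);
    h.finish()
}

fn main() {
    let alpha: Vec<char> = "abc".chars().collect();
    let encoded: Vec<u64> = alpha.iter().map(id_of).collect();
    // Equal inputs always hash to equal ids within one process.
    assert_eq!(encoded[0], id_of(&'a'));
    assert_ne!(encoded[0], encoded[1]);
}
```

This trades the hand-built parallel-Vec mapping for a one-way encoding; to decode generated ids back to your inputs you still need a reverse map (e.g. a HashMap from id to value).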