Language models.
This module provides n-gram language models that can be trained on tokenized text and used to score and generate word sequences.
§Example
use rustling::lm::LanguageModel;
// Create a bigram MLE language model
let mut model = LanguageModel::new_mle(2).unwrap();
model.fit(vec![
    vec!["the".into(), "cat".into(), "sat".into()],
    vec!["the".into(), "dog".into(), "ran".into()],
]);
// P("cat" | "the") = count("the cat") / count("the") = 1 / 2
let score = model.score("cat".into(), Some(vec!["the".into()])).unwrap();
assert!((score - 0.5).abs() < 1e-9);
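For reference, an MLE n-gram model estimates P(word | context) as count(context, word) / count(context). The following standalone sketch, independent of this crate and using only the standard library, shows how that bigram estimate is computed over the same toy corpus; the helper name bigram_prob is illustrative, not part of this crate's API.

use std::collections::HashMap;

// Count bigrams and context occurrences over tokenized sentences,
// then estimate P(word | prev) = count(prev, word) / count(prev).
fn bigram_prob(sentences: &[Vec<&str>], prev: &str, word: &str) -> f64 {
    let mut bigrams: HashMap<(String, String), usize> = HashMap::new();
    let mut contexts: HashMap<String, usize> = HashMap::new();
    for sent in sentences {
        for pair in sent.windows(2) {
            *contexts.entry(pair[0].to_string()).or_insert(0) += 1;
            *bigrams
                .entry((pair[0].to_string(), pair[1].to_string()))
                .or_insert(0) += 1;
        }
    }
    let num = *bigrams
        .get(&(prev.to_string(), word.to_string()))
        .unwrap_or(&0) as f64;
    let den = *contexts.get(prev).unwrap_or(&0) as f64;
    if den == 0.0 { 0.0 } else { num / den }
}

let corpus = vec![
    vec!["the", "cat", "sat"],
    vec!["the", "dog", "ran"],
];
// "the" occurs twice as a context and is followed by "cat" once: 1/2 = 0.5
assert!((bigram_prob(&corpus, "the", "cat") - 0.5).abs() < 1e-9);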
Structs§
LanguageModel - An n-gram language model.