
Function normalized_token_entropy 

pub fn normalized_token_entropy(text: &str) -> f64

Normalized Shannon entropy: H(X) / log₂(n), where n is the number of unique symbols. Returns a value in [0, 1], where 0 means perfectly predictable and 1 means maximum entropy. Dividing by log₂(n), the maximum possible entropy for an alphabet of n symbols, makes thresholds comparable across different alphabet sizes.
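A minimal sketch of how such a function could be implemented, assuming each Unicode scalar value (`char`) counts as one symbol; the crate's actual tokenization and edge-case handling may differ. Returning 0.0 when there are fewer than two unique symbols avoids dividing by log₂(1) = 0:

```rust
use std::collections::HashMap;

/// Normalized Shannon entropy over the symbols of `text`:
/// H(X) / log2(n), where n is the number of unique symbols.
/// Sketch only: treats each `char` as a symbol.
pub fn normalized_token_entropy(text: &str) -> f64 {
    let mut counts: HashMap<char, usize> = HashMap::new();
    let mut total = 0usize;
    for c in text.chars() {
        *counts.entry(c).or_insert(0) += 1;
        total += 1;
    }
    let n = counts.len();
    if n < 2 {
        // A single repeated symbol (or empty input) is perfectly
        // predictable; log2(1) = 0 would otherwise divide by zero.
        return 0.0;
    }
    // Shannon entropy H(X) = -sum over symbols of p * log2(p)
    let h: f64 = counts
        .values()
        .map(|&c| {
            let p = c as f64 / total as f64;
            -p * p.log2()
        })
        .sum();
    h / (n as f64).log2()
}

fn main() {
    // Uniform distribution over 4 symbols: maximum entropy, ratio 1.0
    println!("{}", normalized_token_entropy("abcd"));
    // One repeated symbol: perfectly predictable, ratio 0.0
    println!("{}", normalized_token_entropy("aaaa"));
}
```

For example, `"abcd"` has four equiprobable symbols, so H(X) = 2 bits and log₂(4) = 2, giving 1.0, while `"aaaa"` yields 0.0.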