pub fn normalized_token_entropy(text: &str) -> f64
Normalized Shannon entropy: H(X) / log₂(n), where n is the number of unique symbols. Returns a value in [0, 1], where 0 means perfectly predictable and 1 means maximum entropy. This makes thresholds comparable across different alphabet sizes.
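A minimal sketch of how such a function might be implemented, assuming symbols are `char`s (the actual crate may tokenize differently) and that texts with fewer than two unique symbols return 0 to avoid dividing by log₂(1) = 0:

```rust
use std::collections::HashMap;

/// Sketch: Shannon entropy of the character distribution, divided by
/// log2(number of unique characters). Not the crate's actual implementation.
pub fn normalized_token_entropy(text: &str) -> f64 {
    // Count occurrences of each symbol.
    let mut counts: HashMap<char, usize> = HashMap::new();
    for c in text.chars() {
        *counts.entry(c).or_insert(0) += 1;
    }
    let total = text.chars().count() as f64;
    let n = counts.len();
    // 0 or 1 unique symbols: the text is perfectly predictable, and
    // log2(n) would be 0 or undefined, so return 0 directly.
    if n <= 1 {
        return 0.0;
    }
    // H(X) = -Σ p(x) · log2(p(x))
    let h: f64 = counts
        .values()
        .map(|&c| {
            let p = c as f64 / total;
            -p * p.log2()
        })
        .sum();
    // Normalize by the maximum possible entropy for n symbols.
    h / (n as f64).log2()
}

fn main() {
    println!("{}", normalized_token_entropy("aaaa")); // single symbol: 0.0
    println!("{}", normalized_token_entropy("abab")); // two equiprobable symbols: 1.0
}
```

For example, `"aab"` has H = (2/3)·log₂(3/2) + (1/3)·log₂(3) ≈ 0.918, and with n = 2 the normalizer log₂(2) = 1, so the result is ≈ 0.918.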