Entropy calculation module for histograms.
This module provides functions to calculate the entropy (average code length) of data represented by histograms. The implementations are tuned for accuracy rather than raw performance; since a histogram has only 256 elements, the extra cost is negligible.
§Examples
Basic usage:
```rust
use lossless_transform_utils::histogram::Histogram32;
use lossless_transform_utils::entropy::code_length_of_histogram32;

let mut histogram = Histogram32::default();
histogram.inner.counter[0] = 3;
histogram.inner.counter[1] = 2;
histogram.inner.counter[2] = 1;

let entropy = code_length_of_histogram32(&histogram, 6);
println!("Entropy: {}", entropy);
```

§Performance Note
The implementation in this module favours accuracy over performance: it uses proper f64 arithmetic with log2, which is normally slow. However, because the input histograms have only 256 elements, this accuracy-for-performance tradeoff is considered worthwhile here.
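To make the note above concrete, here is a minimal standalone sketch of Shannon entropy over a 256-bucket histogram using plain f64 arithmetic with log2. It assumes a bare `[u32; 256]` in place of the crate's `Histogram32` type, so it is an illustration of the technique, not the crate's actual code:

```rust
/// Shannon entropy in bits per symbol for a 256-bucket histogram.
/// A sketch only: uses a plain `[u32; 256]` instead of `Histogram32`.
fn shannon_entropy(counts: &[u32; 256]) -> f64 {
    let total: u64 = counts.iter().map(|&c| c as u64).sum();
    if total == 0 {
        return 0.0; // empty histogram carries no information
    }
    let total_f = total as f64;
    counts
        .iter()
        .filter(|&&c| c != 0) // zero-count symbols contribute nothing
        .map(|&c| {
            let p = c as f64 / total_f; // symbol probability
            -p * p.log2() // contribution in bits
        })
        .sum()
}

fn main() {
    // Same counts as the example above: 3, 2, 1 over 6 elements.
    let mut counts = [0u32; 256];
    counts[0] = 3;
    counts[1] = 2;
    counts[2] = 1;
    println!("{:.4}", shannon_entropy(&counts)); // ≈ 1.4591 bits per symbol
}
```

Even though `log2` is comparatively expensive, the loop runs at most 256 iterations, which is why the module can afford the accurate f64 path.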
Functions§
- code_length_of_histogram32 - Calculates the ideal code length in bits for a given histogram. This lets us estimate how compressible the data is during ‘entropy coding’ steps.
- code_length_of_histogram32_no_size - Calculates the ideal code length in bits for a given histogram. This lets us estimate how compressible the data is during ‘entropy coding’ steps.
- shannon_entropy_of_histogram32 - Calculates the Shannon entropy of a Histogram32 using floating point arithmetic. The entropy is the average number of bits needed to represent each symbol.
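The page gives the same summary for code_length_of_histogram32 and its _no_size sibling. Judging purely by the names (an assumption; this page does not spell out the difference), the sized variant likely takes a precomputed element count while the _no_size variant sums the histogram itself. A hypothetical sketch of that split, again using a plain `[u32; 256]` instead of the crate's `Histogram32`:

```rust
/// Hypothetical sketch, not the crate's actual implementation:
/// per-symbol code length given a precomputed element count `total`.
fn code_length(counts: &[u32; 256], total: u64) -> f64 {
    if total == 0 {
        return 0.0;
    }
    let total_f = total as f64;
    counts
        .iter()
        .filter(|&&c| c != 0)
        .map(|&c| {
            let p = c as f64 / total_f;
            -p * p.log2()
        })
        .sum()
}

/// Variant that derives the total itself, as a `_no_size` name suggests.
fn code_length_no_size(counts: &[u32; 256]) -> f64 {
    let total: u64 = counts.iter().map(|&c| c as u64).sum();
    code_length(counts, total)
}

fn main() {
    let mut counts = [0u32; 256];
    counts[0] = 3;
    counts[1] = 2;
    counts[2] = 1;
    // Both paths agree when the supplied total matches the histogram.
    println!("{:.4}", code_length(&counts, 6));
    println!("{:.4}", code_length_no_size(&counts));
}
```

Accepting a known total skips one 256-element summation per call; with histograms this small the saving is minor, which is consistent with the module's accuracy-first stance.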