# ruvector-dither

Deterministic, low-discrepancy pre-quantization dithering for low-bit neural network inference on tiny devices (WASM, Seed, STM32).
## Why dither?
Quantizers at 3/5/7 bits can align with power-of-two boundaries, producing idle tones, sticky activations, and periodic errors that degrade accuracy. A sub-LSB pre-quantization offset:
- Decorrelates the signal from grid boundaries.
- Pushes quantization error toward high frequencies (blue-noise-like), which average out downstream.
- Uses no RNG -- outputs are deterministic, reproducible across platforms (WASM / x86 / ARM), and cache-friendly.
## Features
- Golden-ratio sequence -- best-possible 1-D equidistribution; irrational step, so the sequence never repeats.
- Pi-digit table -- 256-byte cyclic lookup, exact reproducibility from a tensor/layer ID.
- Per-channel dither pools -- structurally decorrelated channels without any randomness.
- Scalar, slice, and integer-code quantization helpers included.
- `no_std`-compatible -- zero runtime dependencies; enable with `features = ["no_std"]`.
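The cyclic-table idea behind the pi-digit source can be sketched independently of the crate (the table size, contents, and seeding scheme below are illustrative, not the crate's actual layout): a fixed digit table, indexed cyclically from a start offset derived from a layer/tensor ID, yields the same sequence on every platform with no RNG.

```rust
// Illustrative sketch of a cyclic digit-table dither source; the crate's
// actual 256-byte table and seeding scheme may differ.
const PI_DIGITS: [u8; 16] = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3];

struct TableDither {
    idx: usize,
}

impl TableDither {
    /// Seed the start index from a layer/tensor ID -> exact reproducibility.
    fn new(layer_id: usize) -> Self {
        Self { idx: layer_id % PI_DIGITS.len() }
    }

    /// Next offset in [-0.5, 0.5) LSB units, derived from a digit 0..=9.
    fn next(&mut self) -> f32 {
        let d = PI_DIGITS[self.idx];
        self.idx = (self.idx + 1) % PI_DIGITS.len();
        (d as f32 + 0.5) / 10.0 - 0.5
    }
}

fn main() {
    let mut a = TableDither::new(7);
    let mut b = TableDither::new(7);
    // Same seed -> bit-identical sequences, on any platform.
    for _ in 0..32 {
        assert_eq!(a.next(), b.next());
    }
}
```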
## Quick start
```rust
use ruvector_dither::{GoldenRatioDither, PiDither, quantize_dithered};

// NOTE: argument shapes below are illustrative; see the crate docs for
// the exact signatures.

// Golden-ratio dither, 8-bit, epsilon = 0.5 LSB
let mut gr = GoldenRatioDither::new(0.5);
let q = quantize_dithered(0.3, 8, &mut gr);
assert!((q - 0.3_f32).abs() <= 1.0 / 255.0);

// Pi-digit dither, 5-bit
let mut pi = PiDither::new(/* layer id */ 0);
let q2 = quantize_dithered(0.3, 5, &mut pi);
assert!((q2 - 0.3_f32).abs() <= 1.0 / 31.0);
```
## Per-channel batch quantization
```rust
use ruvector_dither::ChannelDither;

// NOTE: constructor and argument shapes are illustrative; see the crate docs.
let mut cd = ChannelDither::new(/* layer id */ 0, /* channels */ 8);
let mut activations = vec![0.0_f32; 64]; // shape [batch=8, channels=8]
cd.quantize_batch(&mut activations, /* bits */ 8);
```
## Modules
| Module | Description |
|---|---|
| `golden` | `GoldenRatioDither` -- additive golden-ratio quasi-random sequence |
| `pi` | `PiDither` -- cyclic 256-byte table derived from the digits of pi |
| `quantize` | `quantize_dithered`, `quantize_slice_dithered`, `quantize_to_code` |
| `channel` | `ChannelDither` -- per-channel dither pool seeded from layer/channel IDs |
## Trait: `DitherSource`
Implement DitherSource to plug in your own deterministic sequence:
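A minimal sketch of a custom source, assuming the trait exposes a single next-offset method (the actual method name and signature may differ; check the crate docs):

```rust
// Hypothetical trait shape -- the crate's real signature may differ.
pub trait DitherSource {
    /// Next sub-LSB offset, in LSB units, in [-0.5, 0.5).
    fn next_offset(&mut self) -> f32;
}

/// A toy deterministic source: a fixed additive step modulo 1.
pub struct StepDither {
    state: f32,
    step: f32,
}

impl DitherSource for StepDither {
    fn next_offset(&mut self) -> f32 {
        self.state = (self.state + self.step).fract(); // stays in [0, 1)
        self.state - 0.5
    }
}

fn main() {
    let mut s = StepDither { state: 0.0, step: 0.618_034 };
    for _ in 0..100 {
        let o = s.next_offset();
        assert!((-0.5..0.5).contains(&o)); // always sub-LSB
    }
}
```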
## License
Licensed under either of Apache License, Version 2.0 or MIT License at your option.