Weight initialization strategies for neural network parameters.
Provides common initialization methods including Xavier/Glorot, Kaiming/He,
LeCun, orthogonal, and basic constant/normal/uniform initializations.
Uses a deterministic LCG-based RNG (no rand crate dependency).
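Because the RNG is self-contained, a given seed always reproduces the same weights. A minimal sketch of the underlying idea, using Knuth's MMIX constants; the crate's actual constants and method names may differ:

```rust
// Illustrative LCG sketch; the crate's actual constants and method
// names may differ.
struct Lcg {
    state: u64,
}

impl Lcg {
    fn new(seed: u64) -> Self {
        Lcg { state: seed }
    }

    /// Advance with the classic recurrence: state = state * a + c (mod 2^64).
    /// Constants are from Knuth's MMIX generator.
    fn next_u64(&mut self) -> u64 {
        self.state = self
            .state
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        self.state
    }

    /// Map the top 24 bits to an f32 in [0, 1).
    fn next_f32(&mut self) -> f32 {
        (self.next_u64() >> 40) as f32 / (1u64 << 24) as f32
    }
}
```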
Structs§
- InitRng - A deterministic linear congruential generator (LCG) for reproducible weight initialization without depending on the rand crate.
- InitStats - Statistics about an initialized weight tensor.
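The exact contents of InitStats are not specified here; as a rough sketch, such a summary typically records moments and extrema of the filled buffer. The field names below are assumptions, not the crate's definition:

```rust
// Hypothetical shape of InitStats; the real field names are unknown here.
struct InitStats {
    mean: f32,
    std_dev: f32,
    min: f32,
    max: f32,
}

fn stats_of(weights: &[f32]) -> InitStats {
    let n = weights.len().max(1) as f32; // avoid dividing by zero
    let mean = weights.iter().sum::<f32>() / n;
    let var = weights.iter().map(|w| (w - mean).powi(2)).sum::<f32>() / n;
    InitStats {
        mean,
        std_dev: var.sqrt(),
        min: weights.iter().cloned().fold(f32::INFINITY, f32::min),
        max: weights.iter().cloned().fold(f32::NEG_INFINITY, f32::max),
    }
}
```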
Enums§
- FanMode - Selects whether to use fan_in or fan_out for Kaiming initialization.
- InitError - Errors that can occur during weight initialization.
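A sketch of how a FanMode-style switch typically selects the variance denominator; the variant names here are assumptions:

```rust
// Variant names are assumptions; check the enum's own page for the
// real definition.
enum FanMode {
    FanIn,
    FanOut,
}

fn selected_fan(mode: FanMode, fan_in: usize, fan_out: usize) -> usize {
    match mode {
        FanMode::FanIn => fan_in,   // preserves forward-pass activation variance
        FanMode::FanOut => fan_out, // preserves backward-pass gradient variance
    }
}
```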
Functions§
- compute_fans - Compute (fan_in, fan_out) from a weight tensor shape.
- constant_init - Initialize all elements to a constant value.
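A sketch of the fan convention compute_fans most likely follows, matching the common [out, in, kernel...] shape layout; the handling of higher-rank tensors is an assumption:

```rust
// Sketch of the usual fan convention for a shape laid out as
// [out, in, kernel dims...]; higher-rank handling is an assumption.
fn compute_fans(shape: &[usize]) -> Option<(usize, usize)> {
    match shape {
        // Scalar shape has no meaningful fans.
        [] => None,
        // 1-D (e.g. a bias): both fans are the length.
        [n] => Some((*n, *n)),
        // 2-D linear layer: [out_features, in_features].
        [out, inp] => Some((*inp, *out)),
        // Conv-style: multiply channel dims by the receptive field size.
        [out, inp, kernel @ ..] => {
            let rf: usize = kernel.iter().product();
            Some((*inp * rf, *out * rf))
        }
    }
}
```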
- gain_for_activation - Return the recommended gain for a given activation function name.
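The customary gain values follow PyTorch's calculate_gain; which activation names this crate actually accepts is an assumption:

```rust
// Gains as conventionally defined (cf. PyTorch's calculate_gain);
// the set of names this crate accepts is an assumption.
fn gain_for_activation(name: &str) -> Option<f64> {
    match name {
        "linear" | "identity" | "sigmoid" => Some(1.0),
        "tanh" => Some(5.0 / 3.0),
        "relu" => Some(std::f64::consts::SQRT_2),
        // PyTorch's default negative slope of 0.01.
        "leaky_relu" => Some((2.0 / (1.0 + 0.01f64.powi(2))).sqrt()),
        _ => None,
    }
}
```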
- kaiming_normal - Kaiming (He) normal initialization.
- kaiming_uniform - Kaiming (He) uniform initialization.
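Both Kaiming variants scale by the fan chosen via FanMode. A sketch of the standard scale factors; the helper names are illustrative:

```rust
// Standard Kaiming/He scale factors; `fan` is whichever of fan_in or
// fan_out the FanMode selects. Helper names are illustrative.
fn kaiming_std(gain: f64, fan: usize) -> f64 {
    gain / (fan as f64).sqrt() // std of the normal for kaiming_normal
}

fn kaiming_bound(gain: f64, fan: usize) -> f64 {
    gain * (3.0 / fan as f64).sqrt() // bound of U(-bound, bound) for kaiming_uniform
}
```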
- lecun_normal - LeCun normal initialization: mean 0, standard deviation 1/sqrt(fan_in).
- lecun_uniform - LeCun uniform initialization: U(-limit, limit) where limit = sqrt(3/fan_in).
- normal_init - Normal initialization with specified mean and standard deviation.
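Since the module's RNG is uniform, the normal draws behind normal_init, lecun_normal, and the other *_normal functions are presumably derived via a transform such as Box-Muller. A sketch reusing the Lcg type from above:

```rust
// Box-Muller: two uniform draws become one standard normal draw,
// then shift/scale to N(mean, std^2). Reuses the Lcg sketch above.
fn normal_sample(rng: &mut Lcg, mean: f32, std: f32) -> f32 {
    let u1 = rng.next_f32().max(f32::MIN_POSITIVE); // keep ln() finite
    let u2 = rng.next_f32();
    let z = (-2.0 * u1.ln()).sqrt() * (std::f32::consts::TAU * u2).cos();
    mean + std * z
}

// LeCun normal is then normal_init with std = 1 / sqrt(fan_in).
fn lecun_normal_fill(rng: &mut Lcg, weights: &mut [f32], fan_in: usize) {
    let std = 1.0 / (fan_in as f32).sqrt();
    for w in weights.iter_mut() {
        *w = normal_sample(rng, 0.0, std);
    }
}
```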
- ones_init - Initialize all elements to one.
- orthogonal_init - Orthogonal initialization via QR-like Gram-Schmidt on a random matrix.
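A sketch of the Gram-Schmidt step behind orthogonal_init, operating on a row-major rows x cols buffer; it assumes rows <= cols so the rows can actually be made orthonormal:

```rust
// Classical Gram-Schmidt over the rows of a row-major rows x cols
// buffer; a sketch of the idea, not the crate's implementation.
fn gram_schmidt_rows(m: &mut [f32], rows: usize, cols: usize) {
    for i in 0..rows {
        // Remove the components along every earlier (already unit) row.
        for j in 0..i {
            let dot: f32 = (0..cols).map(|k| m[i * cols + k] * m[j * cols + k]).sum();
            for k in 0..cols {
                m[i * cols + k] -= dot * m[j * cols + k];
            }
        }
        // Normalize row i to unit length.
        let norm = (0..cols).map(|k| m[i * cols + k].powi(2)).sum::<f32>().sqrt();
        if norm > 0.0 {
            for k in 0..cols {
                m[i * cols + k] /= norm;
            }
        }
    }
}
```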
- uniform_init - Uniform initialization with specified bounds [low, high).
- xavier_normal - Xavier (Glorot) normal initialization.
- xavier_uniform - Xavier (Glorot) uniform initialization.
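The standard Xavier/Glorot scale factors average both fans; a sketch, with illustrative helper names:

```rust
// Standard Xavier/Glorot scale factors; helper names are illustrative.
fn xavier_std(gain: f64, fan_in: usize, fan_out: usize) -> f64 {
    gain * (2.0 / (fan_in + fan_out) as f64).sqrt() // for xavier_normal
}

fn xavier_limit(gain: f64, fan_in: usize, fan_out: usize) -> f64 {
    gain * (6.0 / (fan_in + fan_out) as f64).sqrt() // for xavier_uniform
}
```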
- zeros_init - Initialize all elements to zero.
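Putting the pieces together, a hypothetical end-to-end fill built from the sketches above; the crate's real functions likely take a tensor type and may return Result<_, InitError>, so treat this purely as an illustration:

```rust
// Hypothetical usage; signatures are assumptions, not the crate's API.
fn main() {
    let mut rng = Lcg::new(42);
    let (out_f, in_f) = (128, 64); // a [out, in] linear layer
    let mut weights = vec![0.0f32; out_f * in_f];

    // Kaiming-normal for ReLU: std = sqrt(2) / sqrt(fan_in).
    let std = (std::f64::consts::SQRT_2 / (in_f as f64).sqrt()) as f32;
    for w in weights.iter_mut() {
        *w = normal_sample(&mut rng, 0.0, std);
    }
}
```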