
Module weight_init


Weight initialization strategies for neural network parameters.

Provides common initialization methods including Xavier/Glorot, Kaiming/He, LeCun, orthogonal, and basic constant/normal/uniform initializations. Uses a deterministic LCG-based RNG (no rand crate dependency).
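The deterministic LCG described above can be sketched as follows. This is an illustrative stand-in for `InitRng`, not this crate's actual implementation; the multiplier and increment are the well-known Numerical Recipes 64-bit LCG constants, which may differ from the ones used here.

```rust
// Hypothetical sketch of an LCG-based RNG in the spirit of InitRng.
struct Lcg {
    state: u64,
}

impl Lcg {
    fn new(seed: u64) -> Self {
        Lcg { state: seed }
    }

    /// Advance the state and return the next raw 64-bit value.
    fn next_u64(&mut self) -> u64 {
        // Numerical Recipes constants for a 64-bit LCG (assumed, not confirmed).
        self.state = self
            .state
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        self.state
    }

    /// Map the top 53 bits of the state to a float in [0, 1).
    fn next_f64(&mut self) -> f64 {
        (self.next_u64() >> 11) as f64 / (1u64 << 53) as f64
    }
}

fn main() {
    // Same seed => identical sequence: reproducible initialization
    // without pulling in the rand crate.
    let mut a = Lcg::new(42);
    let mut b = Lcg::new(42);
    assert_eq!(a.next_u64(), b.next_u64());
    let x = a.next_f64();
    assert!(x >= 0.0 && x < 1.0);
    println!("ok");
}
```

The key property for weight initialization is reproducibility: two generators seeded identically yield identical weight tensors, which makes training runs comparable.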

Structs

InitRng
A deterministic linear congruential generator (LCG) for reproducible weight initialization without depending on the rand crate.
InitStats
Statistics about an initialized weight tensor.

Enums

FanMode
Selects whether to use fan_in or fan_out for Kaiming initialization.
InitError
Errors that can occur during weight initialization.

Functions

compute_fans
Compute (fan_in, fan_out) from a weight tensor shape.
constant_init
Initialize all elements to a constant value.
gain_for_activation
Return the recommended gain for a given activation function name.
kaiming_normal
Kaiming (He) normal initialization.
kaiming_uniform
Kaiming (He) uniform initialization.
lecun_normal
LeCun normal initialization: N(0, 1/sqrt(fan_in)).
lecun_uniform
LeCun uniform initialization: U(-limit, limit) where limit = sqrt(3/fan_in).
normal_init
Normal initialization with specified mean and standard deviation.
ones_init
Initialize all elements to one.
orthogonal_init
Orthogonal initialization via QR-like Gram-Schmidt on a random matrix.
uniform_init
Uniform initialization with specified bounds [low, high).
xavier_normal
Xavier (Glorot) normal initialization.
xavier_uniform
Xavier (Glorot) uniform initialization.
zeros_init
Initialize all elements to zero.
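The fan computation and the sampling bounds behind the initializers above can be sketched as below. The function names and the fan convention (`shape = [fan_out, fan_in, ...receptive field]`) are assumptions for illustration, not this crate's API; the formulas are the standard ones (Xavier uniform limit `sqrt(6/(fan_in+fan_out))`, Kaiming normal std `sqrt(2)/sqrt(fan)` for ReLU, LeCun uniform limit `sqrt(3/fan_in)` as stated above).

```rust
/// Compute (fan_in, fan_out) from a weight shape, assuming the common
/// [out, in, ...kernel dims] layout. For rank > 2, the receptive-field
/// size multiplies into both fans.
fn compute_fans(shape: &[usize]) -> (usize, usize) {
    let receptive: usize = shape.get(2..).map_or(1, |s| s.iter().product());
    (shape[1] * receptive, shape[0] * receptive)
}

/// Xavier/Glorot uniform limit: sqrt(6 / (fan_in + fan_out)).
fn xavier_uniform_limit(fan_in: usize, fan_out: usize) -> f64 {
    (6.0 / (fan_in + fan_out) as f64).sqrt()
}

/// Kaiming/He normal standard deviation for ReLU: sqrt(2 / fan).
fn kaiming_normal_std(fan: usize) -> f64 {
    (2.0 / fan as f64).sqrt()
}

/// LeCun uniform limit: sqrt(3 / fan_in), matching the description above.
fn lecun_uniform_limit(fan_in: usize) -> f64 {
    (3.0 / fan_in as f64).sqrt()
}

fn main() {
    // A dense layer with 32 inputs and 64 outputs.
    let (fan_in, fan_out) = compute_fans(&[64, 32]);
    assert_eq!((fan_in, fan_out), (32, 64));

    let limit = xavier_uniform_limit(fan_in, fan_out);
    assert!((limit - (6.0_f64 / 96.0).sqrt()).abs() < 1e-12);

    println!("xavier limit = {:.4}", limit);
    println!("kaiming std  = {:.4}", kaiming_normal_std(fan_in));
    println!("lecun limit  = {:.4}", lecun_uniform_limit(fan_in));
}
```

Weights would then be drawn from `U(-limit, limit)` or `N(0, std^2)` using the deterministic RNG, with the `FanMode` choice selecting `fan_in` (preserve forward-pass variance) or `fan_out` (preserve backward-pass variance) for the Kaiming variants.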