Module xavier

§Xavier

Xavier initialization techniques were introduced in 2010 by Xavier Glorot and Yoshua Bengio. These methods initialize the weights of a neural network in a way that mitigates the vanishing and exploding gradient problems. The technique comes in two variants: XavierNormal and XavierUniform.

Structs§

XavierNormal
Normal Xavier initializers leverage a normal distribution with a mean of 0 and a standard deviation (σ) computed by the formula: σ = sqrt(2/(d_in + d_out))
XavierUniform
Uniform Xavier initializers use a uniform distribution to initialize the weights of a neural network within a symmetric range [-a, a]; in the standard Glorot scheme, a = sqrt(6/(d_in + d_out)). See the sketch after this list.
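
A minimal sketch of the two formulas above, assuming the standard Glorot scheme with no extra gain factor. It only computes the distribution parameters from d_in and d_out and does not call this crate's XavierNormal or XavierUniform types, whose constructors are not shown here.

```rust
/// XavierNormal: standard deviation sigma = sqrt(2 / (d_in + d_out)).
fn xavier_normal_std(d_in: usize, d_out: usize) -> f64 {
    (2.0 / (d_in + d_out) as f64).sqrt()
}

/// XavierUniform (standard Glorot scheme): weights are drawn from [-a, a]
/// with a = sqrt(6 / (d_in + d_out)).
fn xavier_uniform_bound(d_in: usize, d_out: usize) -> f64 {
    (6.0 / (d_in + d_out) as f64).sqrt()
}

fn main() {
    // Example layer shape: 256 inputs, 128 outputs.
    let (d_in, d_out) = (256, 128);
    println!("XavierNormal  sigma = {}", xavier_normal_std(d_in, d_out));
    println!("XavierUniform bound = {}", xavier_uniform_bound(d_in, d_out));
}
```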