
Module neuronika::nn::init


Initialization functions for layers’ parameters.

These initializers define a way to set the initial random weights of neuronika’s layers.

Using an initializer

You can freely access any learnable component of any layer, as their visibility is public, and pass it, by reference, to the initialization function of your choice.

use neuronika::nn;
use neuronika::nn::init::{calculate_gain, xavier_normal};

let mut lin = nn::Linear::new(10, 10);

xavier_normal(&lin.weight, calculate_gain("relu"));

Functions

calculate_fan_in_fan_out
Returns the fan_in and the fan_out.

calculate_gain
Returns the recommended gain value for the given non-linearity function.

constant
Fills the differentiable leaf variable with a constant value.

dirac
Fills the {3, 4, 5}-dimensional differentiable leaf variable with the Dirac delta function.

eye
Fills the matrix differentiable leaf variable with the identity matrix.

normal
Fills the differentiable leaf variable with elements drawn from the normal distribution N(mean, std²).

ones
Fills the differentiable leaf variable with ones.

uniform
Fills the differentiable leaf variable with elements drawn from the uniform distribution U(low, high).

xavier_normal
Fills the differentiable leaf variable with values according to the method described in Understanding the difficulty of training deep feedforward neural networks - Glorot, X. & Bengio, Y. (2010), using a normal distribution.

xavier_uniform
Fills the differentiable leaf variable with values according to the method described in Understanding the difficulty of training deep feedforward neural networks - Glorot, X. & Bengio, Y. (2010), using a uniform distribution.

zeros
Fills the differentiable leaf variable with zeros.
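As a rough illustration of the Glorot (2010) schemes listed above, here is a self-contained sketch of the underlying formulas. This is not neuronika’s internal code: the function names (`gain`, `xavier_normal_std`, `xavier_uniform_bound`) are made up for this example, and the gain values assume the convention popularized by deep-learning frameworks (1 for linear/sigmoid, 5/3 for tanh, √2 for relu).

```rust
/// Illustrative gain table for a few common non-linearities
/// (assumption: mirrors the widely used framework convention).
fn gain(nonlinearity: &str) -> f64 {
    match nonlinearity {
        "linear" | "sigmoid" => 1.0,
        "tanh" => 5.0 / 3.0,
        "relu" => 2.0_f64.sqrt(),
        other => panic!("unknown non-linearity: {}", other),
    }
}

/// Standard deviation used by the xavier_normal scheme:
/// std = gain * sqrt(2 / (fan_in + fan_out)).
fn xavier_normal_std(gain: f64, fan_in: usize, fan_out: usize) -> f64 {
    gain * (2.0 / (fan_in + fan_out) as f64).sqrt()
}

/// Half-width of the interval used by the xavier_uniform scheme:
/// values are drawn from U(-bound, bound) with
/// bound = gain * sqrt(6 / (fan_in + fan_out)).
fn xavier_uniform_bound(gain: f64, fan_in: usize, fan_out: usize) -> f64 {
    gain * (6.0 / (fan_in + fan_out) as f64).sqrt()
}

fn main() {
    // For a 10 x 10 Linear layer, fan_in = fan_out = 10.
    let g = gain("relu");
    println!("std   = {:.4}", xavier_normal_std(g, 10, 10));
    println!("bound = {:.4}", xavier_uniform_bound(g, 10, 10));
}
```

For the 10 × 10 layer from the earlier example, this gives a standard deviation of √0.2 ≈ 0.447 for the normal variant and a bound of √0.6 ≈ 0.775 for the uniform one.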