Module tch::nn::init


Variable initialization.

Enums

FanInOut
  Number of features as input or output of a layer. In Kaiming initialization, choosing FanIn preserves the magnitude of the variance of the weights in the forward pass; choosing FanOut preserves it in the backward pass.
Init
  Variable initializations.
NonLinearity
  The non-linear function that follows this layer. ReLU is the recommended value.
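A minimal sketch of how these enums fit together when building an initialization value, assuming the Kaiming variant of Init carries a distribution (NormalOrUniform), a fan mode (FanInOut), and a NonLinearity, as in recent tch releases:

    use tch::nn::init::{FanInOut, Init, NonLinearity, NormalOrUniform};

    // Kaiming (He) initialization drawn from a uniform distribution.
    // FanIn scales by the number of input features, preserving the variance
    // magnitude in the forward pass; FanOut would preserve it in the backward pass.
    let kaiming: Init = Init::Kaiming {
        dist: NormalOrUniform::Uniform,
        fan: FanInOut::FanIn,
        non_linearity: NonLinearity::ReLU,
    };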

Constants

Functions

f_init
  Creates a new float tensor with the specified shape, device, and initialization, returning an error on failure.
init
  Creates a new float tensor with the specified shape, device, and initialization.
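A minimal usage sketch, assuming the signatures init(Init, &[i64], Device) -> Tensor and f_init(Init, &[i64], Device) -> Result<Tensor, TchError>; the shapes and bounds below are arbitrary illustration values:

    use tch::nn::init::{f_init, init, Init};
    use tch::{Device, Tensor};

    fn main() -> Result<(), tch::TchError> {
        // A 128x64 weight tensor on the CPU, filled uniformly in [-0.1, 0.1].
        let w: Tensor = init(Init::Uniform { lo: -0.1, up: 0.1 }, &[128, 64], Device::Cpu);

        // The f_-prefixed variant returns a Result instead of panicking on error.
        let b = f_init(Init::Const(0.0), &[64], Device::Cpu)?;

        println!("{:?} {:?}", w.size(), b.size());
        Ok(())
    }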