Module layer

Dense layer implementation.

A Layer is a dense affine transform followed by an element-wise activation:

  • z = W x + b
  • y = activation(z)

The activation is stored in the layer so an Mlp can mix activation functions across layers.

Shape mismatches are treated as programmer error and will panic via assert!.

Structs§

Layer
A dense layer: y = activation(Wx + b).

Enums§

Init
Initialization scheme for layer weights.