Dense layer implementation.
A Layer is a dense affine transform followed by an element-wise activation:
z = W x + b
y = activation(z)
The activation is stored in the layer so an Mlp can mix activation functions
across layers.
Shape mismatches are treated as programmer error and will panic via assert!.
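The forward pass described above can be sketched as follows. This is an illustrative standalone example, not the crate's actual implementation; the function and parameter names (`forward`, `w`, `b`, `x`) are assumptions, and weights are stored row-major with shape (out, in). As in the description, shape mismatches panic via `assert!`.

```rust
// Hypothetical sketch of a dense forward pass: y = activation(W x + b).
// `w` has one row per output unit; `b` has one entry per output unit.
fn forward(w: &[Vec<f64>], b: &[f64], x: &[f64], activation: fn(f64) -> f64) -> Vec<f64> {
    // Shape mismatches are programmer error: panic, as the docs state.
    assert!(w.len() == b.len(), "rows of W must match length of b");
    w.iter()
        .zip(b)
        .map(|(row, &bi)| {
            assert!(row.len() == x.len(), "cols of W must match length of x");
            // z_i = sum_j W[i][j] * x[j] + b[i]
            let z: f64 = row.iter().zip(x).map(|(wi, xi)| wi * xi).sum::<f64>() + bi;
            activation(z)
        })
        .collect()
}

fn relu(z: f64) -> f64 {
    z.max(0.0)
}

fn main() {
    // A 2-in, 2-out layer with ReLU activation.
    let w = vec![vec![1.0, -1.0], vec![0.5, 0.5]];
    let b = vec![0.0, 1.0];
    let y = forward(&w, &b, &[2.0, 1.0], relu);
    println!("{:?}", y); // prints [1.0, 2.5]
}
```

Because the activation is passed in (here as a plain function pointer), each layer can carry its own activation, which is what lets an `Mlp` mix activation functions across layers.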
Structs
- Layer
- A dense layer: y = activation(W x + b).
Enums
- Init
- Initialization scheme for layer weights.
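The source does not list `Init`'s variants, so the following is only a hedged sketch of what such an initialization enum might contain; the variant names (`Zeros`, `XavierUniform`) and the limit formula are assumptions, not the crate's actual definition.

```rust
/// Hypothetical initialization schemes for layer weights (illustrative only).
#[derive(Clone, Copy, Debug)]
enum Init {
    /// All weights zero; mainly useful for testing.
    Zeros,
    /// Uniform in [-limit, limit], limit = sqrt(6 / (fan_in + fan_out))
    /// (Glorot/Xavier uniform initialization).
    XavierUniform,
}

/// Limit for Xavier-uniform sampling given the layer's fan-in and fan-out.
fn xavier_limit(fan_in: usize, fan_out: usize) -> f64 {
    (6.0 / (fan_in + fan_out) as f64).sqrt()
}

fn main() {
    let init = Init::XavierUniform;
    println!("{:?}: limit = {:.4}", init, xavier_limit(3, 2));
}
```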