High-level neural network building blocks such as Linear, activations, and tuples as Modules.
Also includes .save() and .load() for all Modules.
Saving and loading model parameters is done using SaveToNpz and LoadFromNpz. All modules provided here implement them, including tuples. So you can call SaveToNpz::save() to save a module to a .npz zip file, and then LoadFromNpz::load() to load the weights back.
Randomizing parameters is done using ResetParams::reset_params(). All modules implement the underlying logic themselves. For example, Linear calculates the distribution it draws values from based on its input size.
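As a rough sketch of input-size-dependent initialization: one common scheme (the PyTorch-style default for linear layers) draws weights uniformly from [-1/sqrt(in), 1/sqrt(in)]. The names below (init_bound, reset_params, the tiny LCG) are illustrative, not this crate's actual API, and the exact distribution the crate uses may differ.

```rust
// Sketch: initialize weights from a uniform distribution whose bound
// depends on the layer's input size, as the text describes.
// Assumption: the bound is 1/sqrt(in_features) (PyTorch-style default).
fn init_bound(in_features: usize) -> f32 {
    1.0 / (in_features as f32).sqrt()
}

// Tiny deterministic LCG so the sketch needs no external RNG crate.
struct Lcg(u64);
impl Lcg {
    fn next_f32(&mut self) -> f32 {
        self.0 = self.0.wrapping_mul(6364136223846793005).wrapping_add(1);
        // Map the top 24 bits to [0, 1).
        ((self.0 >> 40) as f32) / (1u32 << 24) as f32
    }
}

fn reset_params(weights: &mut [f32], in_features: usize, rng: &mut Lcg) {
    let b = init_bound(in_features);
    for w in weights.iter_mut() {
        // Uniform in [-b, b)
        *w = (rng.next_f32() * 2.0 - 1.0) * b;
    }
}

fn main() {
    let mut rng = Lcg(42);
    let in_features = 4;
    let mut weights = vec![0.0f32; 8];
    reset_params(&mut weights, in_features, &mut rng);
    let b = init_bound(in_features); // 0.5 when in_features = 4
    assert!(weights.iter().all(|w| -b <= *w && *w < b));
}
```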
Structs
A Module that calls dropout() during forward with probability self.p.
A Module that calls dropout() during forward with probability 1.0 / N.
Note that dropout() does not do anything for tensors with NoTape.
Implements layer normalization as described in Layer Normalization.
A linear transformation of the form x * transpose(W) + b, where W is a matrix, x is a vector or matrix, and b is a vector. If x is a matrix this does matrix multiplication.
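The vector case of x * transpose(W) + b can be sketched in plain Rust. The function name and the row-major [out, in] layout of W are illustrative assumptions, not the crate's actual API.

```rust
// Sketch of the linear transformation x * transpose(W) + b for a
// single input vector x. w has shape [out, in]: one row per output
// feature, so x * transpose(W) is a dot product of x with each row.
fn linear_forward(x: &[f32], w: &[Vec<f32>], b: &[f32]) -> Vec<f32> {
    w.iter()
        .zip(b.iter())
        .map(|(row, bias)| {
            row.iter().zip(x).map(|(wi, xi)| wi * xi).sum::<f32>() + bias
        })
        .collect()
}

fn main() {
    // 2 input features -> 3 output features
    let w = vec![
        vec![1.0, 0.0],
        vec![0.0, 1.0],
        vec![1.0, 1.0],
    ];
    let b = vec![0.5, 0.5, 0.5];
    let y = linear_forward(&[2.0, 3.0], &w, &b);
    assert_eq!(y, vec![2.5, 3.5, 5.5]);
}
```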
Repeats T N times. This requires that T's input type is the same as its output type.
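Why the input and output types must match: the output of one repetition feeds back in as the input of the next. A minimal sketch with a plain function in place of a module (names are illustrative):

```rust
// Sketch: a "repeat" combinator that applies the same function N
// times. Because each output becomes the next input, the input and
// output types must be the same (here, both f32).
fn repeat<F: Fn(f32) -> f32>(f: F, n: usize, x: f32) -> f32 {
    (0..n).fold(x, |acc, _| f(acc))
}

fn main() {
    // Doubling three times: 1 -> 2 -> 4 -> 8
    assert_eq!(repeat(|x| x * 2.0, 3, 1.0), 8.0);
}
```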
Represents a residual connection around F: F(x) + x, as introduced in Deep Residual Learning for Image Recognition.
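The F(x) + x computation can be sketched elementwise in plain Rust; the function name and slice-based signature are illustrative, not the crate's actual API.

```rust
// Sketch: a residual connection around a function f, computing
// f(x) + x elementwise.
fn residual<F: Fn(&[f32]) -> Vec<f32>>(f: F, x: &[f32]) -> Vec<f32> {
    f(x).iter().zip(x).map(|(fx, xi)| fx + xi).collect()
}

fn main() {
    // f squares each element, so the residual output is x^2 + x.
    let y = residual(|v| v.iter().map(|a| a * a).collect(), &[1.0, 2.0, 3.0]);
    assert_eq!(y, vec![2.0, 6.0, 12.0]);
}
```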
Enums
Error that can happen while loading data from a .npz zip archive.
Traits
Something that can be loaded from a .npz file (which is a zip file).
A unit of a neural network. Acts on the generic Input and produces Module::Output.
Something that can be saved to a .npz file (which is a .zip file).
Functions
Reads data from a file already in a zip archive named filename.
Writes data to a new file in a zip archive named filename.