The core types and logic implementing the minidx crate.
Re-exports§
Modules§
- gradients
- layers - Composable neural-network layers
- loss
- misc
- optimizers
- shapes - Low-level types describing data dimensionality
Structs§
- LoadSaveError - An error loading or saving parameters.
Traits§
- BackpropModule - Some sequential computation which can perform backprop on itself, given trace state and the gradients of its outputs, computing parameter updates for itself (see the sketch after this list).
- Dtype - Represents a data type or element of an array that can have arithmetic operations applied to it. The main difference between Dtype and Unit is that bool is Unit, but not Dtype.
- Float - Trait for floating-point numbers.
- LoadableModule - A module whose parameters can be loaded or saved.
- Module - A unit of computation that consumes Input and produces Module::Output.
- ResetParams - Something that can have its learnable parameters reset.
- RevModule - A unit of computation which can do backpropagation without knowledge of any additional state.
- TracedModule - Some sequential computation that consumes Input and produces Module::Output, but also produces artifacts describing the execution that can later be used during backprop.
- Unit - Represents a unit type with no arithmetic.
- VisualizableUnit - Something which can have its parameters visualized.
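The trait descriptions above suggest a layered contract: Module is plain forward computation, TracedModule additionally records execution artifacts, and BackpropModule turns those artifacts plus output gradients into parameter updates. The following is a minimal, self-contained sketch of that relationship; the method names and signatures here (forward, traced_forward, backprop) are assumptions for illustration only and are not minidx's confirmed API.

```rust
// Simplified, self-contained illustration of the trait relationships
// described above. These traits and method names are assumptions for
// illustration; they are not minidx's actual definitions.

/// Forward-only computation: consumes an input, produces an output.
trait Module<Input> {
    type Output;
    fn forward(&self, x: Input) -> Self::Output;
}

/// Forward computation that also records a trace usable during backprop.
trait TracedModule<Input>: Module<Input> {
    type Trace;
    fn traced_forward(&self, x: Input) -> (Self::Output, Self::Trace);
}

/// Turns output gradients plus the recorded trace into input gradients
/// and parameter updates for this module.
trait BackpropModule<Input>: TracedModule<Input> {
    type SelfGrads;
    fn backprop(&self, trace: &Self::Trace, grads_wrt_output: Self::Output)
        -> (Input, Self::SelfGrads);
}

/// A toy "layer": y = w * x.
struct Scale { w: f32 }

impl Module<f32> for Scale {
    type Output = f32;
    fn forward(&self, x: f32) -> f32 { self.w * x }
}

impl TracedModule<f32> for Scale {
    type Trace = f32; // remember the input for the backward pass
    fn traced_forward(&self, x: f32) -> (f32, f32) { (self.w * x, x) }
}

impl BackpropModule<f32> for Scale {
    type SelfGrads = f32; // dL/dw
    fn backprop(&self, trace: &f32, grad_out: f32) -> (f32, f32) {
        let grad_input = grad_out * self.w; // dL/dx
        let grad_w = grad_out * *trace;     // dL/dw, using the stored input
        (grad_input, grad_w)
    }
}

fn main() {
    let layer = Scale { w: 2.0 };

    // Forward-only use (Module).
    let y = layer.forward(3.0);

    // Traced forward pass plus backprop (TracedModule + BackpropModule).
    let (y_traced, trace) = layer.traced_forward(3.0);
    // Pretend the loss is just the output, so dL/dy = 1.0.
    let (d_input, d_w) = layer.backprop(&trace, 1.0);

    println!("y = {y}, traced y = {y_traced}, dL/dx = {d_input}, dL/dw = {d_w}");
}
```

The toy Scale layer keeps everything scalar so the data flow is visible; in a real stack, sequential layers would thread each layer's trace forward and the gradients backward through the same three-stage contract.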
Functions§
- train_batch - Runs a training minibatch, updating a network based on gradients averaged over N input-output pairs (see the example after this list).
- train_batch_parallel - Parallel version of train_batch. Using more threads is not necessarily faster.
- train_step - Runs a single training step, updating a network using one input-output pair.
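As a rough picture of what the batch helpers describe, the sketch below computes a per-example gradient for each pair in a batch, averages the gradients, and applies one parameter update. The model, loss, learning-rate handling, and the local train_batch function are illustrative assumptions and do not mirror minidx's actual signatures.

```rust
// Self-contained sketch of minibatch training as described above:
// average gradients over N input-output pairs, then take one update.
// Everything here is illustrative; it is not minidx's API.

/// Toy model: y = w * x, trained with squared error and plain SGD.
struct Model { w: f32 }

impl Model {
    fn forward(&self, x: f32) -> f32 { self.w * x }

    /// Gradient of 0.5 * (forward(x) - target)^2 with respect to w.
    fn grad(&self, x: f32, target: f32) -> f32 {
        (self.forward(x) - target) * x
    }
}

/// Averages per-example gradients over the batch, then takes one SGD step.
fn train_batch(model: &mut Model, batch: &[(f32, f32)], lr: f32) {
    let mean_grad: f32 = batch
        .iter()
        .map(|&(x, target)| model.grad(x, target))
        .sum::<f32>()
        / batch.len() as f32;
    model.w -= lr * mean_grad;
}

fn main() {
    // Learn w such that y = 3x from a few (input, output) pairs.
    let data = [(1.0, 3.0), (2.0, 6.0), (-1.0, -3.0), (0.5, 1.5)];
    let mut model = Model { w: 0.0 };
    for _ in 0..200 {
        train_batch(&mut model, &data, 0.1);
    }
    println!("learned w = {:.3} (target 3.0)", model.w);
}
```

A single-pair step (as train_step describes) is the same update with a batch of size one; the parallel variant would distribute the per-example gradient computation across threads before averaging.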