Crate minidx_core

The core types and logic implementing the minidx crate.

Re-exports§

pub use gradients::Gradients;
pub use shapes::*;

Modules§

gradients
layers
Composable neural-network layers
loss
misc
optimizers
shapes
Low-level types describing data dimensionality

Structs§

LoadSaveError
An error loading or saving parameters.

Traits§

BackpropModule
A sequential computation that can perform backprop on itself given trace state and the gradients of its outputs, computing parameter updates for itself (a conceptual sketch follows this list).
Dtype
Represents a data type or element of an array that can have arithmetic operations applied to it. The main difference between Dtype and Unit is that bool is Unit, but not Dtype.
Float
Trait for floating-point numbers.
LoadableModule
A module whose parameters can be loaded or saved.
Module
A unit of computation that consumes Input and produces Module::Output.
ResetParams
Something that can have its learnable parameters reset.
RevModule
A unit of computation which can do backpropagation without knowledge of any additional state.
TracedModule
Some sequential computation that consumes Input and produces Module::Output, but also produces artifacts describing the execution that can later be used during backprop.
Unit
Represents a unit type, but no arithmetic.
VisualizableUnit
Something which can have its parameters visualized.
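
The Module, TracedModule, and BackpropModule traits split a layer's work into a plain forward pass, a forward pass that also records trace state, and a backward pass that turns that trace plus output gradients into parameter gradients. The snippet below is a minimal conceptual sketch of that split using simplified, hypothetical signatures; the actual trait methods and associated types in minidx_core differ. It walks a toy y = w * x layer through one traced forward pass, one backprop call, and one SGD-style update.

```rust
// Conceptual sketch only: simplified stand-ins for the Module, TracedModule,
// and BackpropModule traits. The real minidx_core signatures differ.
trait Module<Input> {
    type Output;
    fn forward(&self, x: Input) -> Self::Output;
}

trait TracedModule<Input>: Module<Input> {
    type Trace;
    // Forward pass that also records whatever is needed for backprop.
    fn traced_forward(&self, x: Input) -> (Self::Output, Self::Trace);
}

trait BackpropModule<Input>: TracedModule<Input> {
    type SelfGrads;
    // Given the recorded trace and the gradient of the loss with respect
    // to this module's output, compute the parameter gradients.
    fn backprop(&self, trace: &Self::Trace, out_grad: Self::Output) -> Self::SelfGrads;
}

// A toy layer: y = w * x.
struct Scale {
    w: f32,
}

impl Module<f32> for Scale {
    type Output = f32;
    fn forward(&self, x: f32) -> f32 {
        self.w * x
    }
}

impl TracedModule<f32> for Scale {
    type Trace = f32; // remember the input seen during the forward pass
    fn traced_forward(&self, x: f32) -> (f32, f32) {
        (self.w * x, x)
    }
}

impl BackpropModule<f32> for Scale {
    type SelfGrads = f32;
    fn backprop(&self, trace: &f32, out_grad: f32) -> f32 {
        // dL/dw = dL/dy * dy/dw = out_grad * x
        out_grad * *trace
    }
}

fn main() {
    let mut layer = Scale { w: 2.0 };
    let (y, trace) = layer.traced_forward(3.0);
    let out_grad = y - 7.0; // gradient of 0.5 * (y - 7)^2 with respect to y
    let w_grad = layer.backprop(&trace, out_grad);
    layer.w -= 0.1 * w_grad; // one plain SGD step
    println!("updated w = {}", layer.w);
}
```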

Functions§

train_batch
Performs a training minibatch, updating a network based on gradients averaged over N input/output pairs (see the sketch after this list).
train_batch_parallel
Parallel version of train_batch. Using more threads is not necessarily faster.
train_step
Performs a training step, updating a network using a single input/output pair.
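
train_step and train_batch differ mainly in how gradients are aggregated: a step applies the gradients computed from one input/output pair, while a batch averages gradients across N pairs before applying a single update. The following sketch illustrates only that averaging-and-update idea; the flat Vec<f32> gradient representation, the average_gradients helper, and the plain SGD update are hypothetical stand-ins, not the minidx_core types or the signatures of these functions.

```rust
// Illustration of minibatch gradient averaging, not the actual minidx_core
// API: gradients are represented here as flat Vec<f32> buffers.
fn average_gradients(per_example: &[Vec<f32>]) -> Vec<f32> {
    let n = per_example.len() as f32;
    let mut avg = vec![0.0f32; per_example[0].len()];
    for grads in per_example {
        for (a, g) in avg.iter_mut().zip(grads) {
            *a += *g / n;
        }
    }
    avg
}

fn main() {
    // Per-example gradients from three input/output pairs.
    let per_example = vec![
        vec![3.0, -6.0],
        vec![9.0, 0.0],
        vec![0.0, 3.0],
    ];
    let avg = average_gradients(&per_example);
    assert_eq!(avg, vec![4.0, -1.0]);

    // One SGD-style update using the averaged gradients.
    let mut params = vec![1.0f32, 1.0];
    let lr = 0.1;
    for (p, g) in params.iter_mut().zip(&avg) {
        *p -= lr * *g;
    }
    println!("updated params = {:?}", params);
}
```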

Type Aliases§

Error