This module provides network-specific implementations and traits supporting the development of neural network models.
Modules§
Structs§
- LayerBase - The `LayerBase` implementation provides a generic interface for layers within a neural network by associating an activation function `F` with a set of parameters `P`.
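To make the `LayerBase` pairing concrete, here is a minimal, self-contained sketch of a struct that associates an activation function `F` with parameters `P`. The names, fields, and `forward` method are illustrative assumptions, not the crate's actual definition.

```rust
// Hypothetical sketch of a LayerBase-style type: it simply pairs an
// activator `F` with parameters `P`. (Illustrative only; the crate's
// real struct likely differs.)
struct LayerBase<F, P> {
    activator: F,
    params: P,
}

impl<F, P> LayerBase<F, P> {
    fn new(activator: F, params: P) -> Self {
        Self { activator, params }
    }
}

// With f64 weights and a scalar activator, a forward pass might apply
// the activation to a weighted sum of the inputs.
impl<F> LayerBase<F, Vec<f64>>
where
    F: Fn(f64) -> f64,
{
    fn forward(&self, input: &[f64]) -> f64 {
        let sum: f64 = self
            .params
            .iter()
            .zip(input)
            .map(|(w, x)| w * x)
            .sum();
        (self.activator)(sum)
    }
}

fn main() {
    // ReLU activator with two weights: 0.5*2.0 + (-0.25)*4.0 = 0.0
    let relu = |x: f64| x.max(0.0);
    let layer = LayerBase::new(relu, vec![0.5, -0.25]);
    println!("{}", layer.forward(&[2.0, 4.0])); // prints 0
}
```

The generic parameters mirror the description above: swapping `F` changes the activation without touching the parameter storage, and vice versa.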
Traits§
- Model - The `Model` trait defines the core interface for all models; implementors must provide the type of configuration, the type of layout, and the type of parameters used by the model. The crate provides standard, or default, definitions of both the configuration and layout types; however, for
- ModelExt
- NetworkConfig - The `NetworkConfig` trait defines an interface for compatible configurations within the framework, providing a layout and a key-value store for managing hyperparameters.
- NetworkConsts - A trait defining common constants for neural networks.
- NetworkParams
- NeuralNetwork - The `NeuralNetwork` trait is used to define the network itself as well as each of its constituent parts.
- RawContext
- RawLayer - The `RawLayer` trait establishes a common interface for all layers within a given model. Implementors must define the type of parameters they use, and provide methods to access both the activation function and the parameters of the layer.
- RawLayerMut - The `RawLayerMut` trait extends the `RawLayer` trait with mutable access to the layer's parameters and additional methods for training the layer, such as backward propagation and parameter updates.
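The `RawLayer`/`RawLayerMut` split described above is a common read/write trait pattern. The following is a hedged, self-contained sketch of that pattern; the method names (`activator`, `params`, `params_mut`, `apply_gradient`) and the `Dense` implementor are assumptions for illustration, not the crate's actual API.

```rust
// Read-only interface: expose the activator and the parameters.
trait RawLayer {
    type Params;
    type Activator;
    fn activator(&self) -> &Self::Activator;
    fn params(&self) -> &Self::Params;
}

// Extension trait: mutable access plus a training hook.
trait RawLayerMut: RawLayer {
    fn params_mut(&mut self) -> &mut Self::Params;
    fn apply_gradient(&mut self, grad: &Self::Params);
}

fn identity(x: f64) -> f64 {
    x
}

// A toy implementor with f64 weights and a function-pointer activator.
struct Dense {
    weights: Vec<f64>,
    act: fn(f64) -> f64,
}

impl RawLayer for Dense {
    type Params = Vec<f64>;
    type Activator = fn(f64) -> f64;
    fn activator(&self) -> &Self::Activator {
        &self.act
    }
    fn params(&self) -> &Self::Params {
        &self.weights
    }
}

impl RawLayerMut for Dense {
    fn params_mut(&mut self) -> &mut Self::Params {
        &mut self.weights
    }
    // A simple gradient-descent step: w <- w - lr * g.
    fn apply_gradient(&mut self, grad: &Self::Params) {
        let lr = 0.1;
        for (w, g) in self.weights.iter_mut().zip(grad) {
            *w -= lr * g;
        }
    }
}

fn main() {
    let mut layer = Dense { weights: vec![1.0, 2.0], act: identity };
    layer.apply_gradient(&vec![10.0, 10.0]);
    println!("{:?}", layer.params()); // prints [0.0, 1.0]
}
```

Keeping mutation in a separate supertrait-extending trait lets inference and serving code accept `&dyn RawLayer` while training code requires the stronger `RawLayerMut` bound.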
Type Aliases§
- FnLayer - A dynamic, functional alias of the [`Layer`] implementation leveraging boxed closures.
- HeavySideLayer - A [`Layer`] type using the Heaviside step activation function.
- LayerDyn - A dynamic instance of the layer using a boxed activator.
- LayerParams - A type alias for an owned [`Layer`] configured to use the standard `Params` instance.
- LayerParamsBase - A type alias for a layer configured to use the `ParamsBase` instance.
- LinearLayer - A type alias for a layer using a linear activation function.
- ReluLayer - A [`Layer`] type using the ReLU activation function.
- SigmoidLayer - A type alias for a [`Layer`] using a sigmoid activation function.
- TanhLayer - An alias for a [`Layer`] that uses the hyperbolic tangent function.
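The boxed-closure pattern behind aliases like `FnLayer` and `LayerDyn` can be sketched in a few lines. The `DynLayer` struct below is a hypothetical stand-in showing why type erasure matters: storing the activator as `Box<dyn Fn(f64) -> f64>` gives differently-activated layers a single concrete type, so they can live in one collection.

```rust
// Hypothetical sketch: a layer whose activator is type-erased behind a
// boxed closure. (Illustrative only; not the crate's actual alias.)
struct DynLayer {
    activator: Box<dyn Fn(f64) -> f64>,
    weights: Vec<f64>,
}

fn main() {
    // Because every activator shares the type Box<dyn Fn(f64) -> f64>,
    // ReLU, sigmoid, and tanh layers fit in the same Vec.
    let layers: Vec<DynLayer> = vec![
        DynLayer { activator: Box::new(|x: f64| x.max(0.0)), weights: vec![1.0] }, // ReLU
        DynLayer { activator: Box::new(|x: f64| 1.0 / (1.0 + (-x).exp())), weights: vec![1.0] }, // sigmoid
        DynLayer { activator: Box::new(f64::tanh), weights: vec![1.0] }, // tanh
    ];
    let outs: Vec<f64> = layers.iter().map(|l| (l.activator)(0.0)).collect();
    println!("{:?}", outs); // prints [0.0, 0.5, 0.0]
}
```

The trade-off is the usual one: the statically-typed aliases (`ReluLayer`, `SigmoidLayer`, `TanhLayer`) allow the activation call to be inlined, while a boxed activator pays a heap allocation and dynamic dispatch in exchange for runtime flexibility.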