This library provides the core abstractions and utilities for the Concision framework.
§Features
- ParamsBase: A structure for defining the parameters within a neural network.
- Backward: This trait denotes a single backward pass through a layer of a neural network.
- Forward: This trait denotes a single forward pass through a layer of a neural network (a minimal sketch of both passes follows this list).
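The crate's own trait signatures are not reproduced in this summary, so the following is only a self-contained sketch of what a forward/backward pair over ndarray types can look like. The `Dense` struct, the learning-rate parameter, and the exact method signatures are assumptions for illustration, not the crate's definitions.

```rust
use ndarray::{Array1, Array2, Axis};

/// Illustrative stand-in for a `Forward` trait: one forward pass through a layer.
trait Forward<X> {
    type Output;
    fn forward(&self, input: &X) -> Self::Output;
}

/// Illustrative stand-in for a `Backward` trait: one backward pass that
/// consumes the upstream gradient and updates the layer's parameters.
trait Backward<X, Grad> {
    fn backward(&mut self, input: &X, grad: &Grad, lr: f64);
}

/// A toy dense layer; the real crate keeps its parameters in `ParamsBase`.
struct Dense {
    weights: Array2<f64>, // shape: (outputs, inputs)
    bias: Array1<f64>,    // shape: (outputs,)
}

impl Forward<Array1<f64>> for Dense {
    type Output = Array1<f64>;
    fn forward(&self, input: &Array1<f64>) -> Array1<f64> {
        self.weights.dot(input) + &self.bias
    }
}

impl Backward<Array1<f64>, Array1<f64>> for Dense {
    fn backward(&mut self, input: &Array1<f64>, grad: &Array1<f64>, lr: f64) {
        // For y = W x + b: dL/dW = grad ⊗ input and dL/db = grad.
        let gw = grad
            .view()
            .insert_axis(Axis(1))
            .dot(&input.view().insert_axis(Axis(0)));
        self.weights -= &(gw * lr);
        self.bias -= &(grad * lr);
    }
}

fn main() {
    let mut layer = Dense {
        weights: Array2::zeros((2, 3)),
        bias: Array1::zeros(2),
    };
    let x = ndarray::array![1.0, 2.0, 3.0];
    let y = layer.forward(&x);
    layer.backward(&x, &ndarray::array![0.1, -0.2], 0.01);
    println!("{y}");
}
```

With both traits in place, a training step is just forward pass, loss gradient, backward pass; the ApplyGradient traits listed further below cover the parameter-update side of that loop.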
Modules§
- activate: This module implements various activation functions for neural networks.
- data: This module implements a dataset abstraction for machine learning tasks.
- error
- init: This module provides the crate with various initialization methods suitable for machine-learning models.
- math: A suite of mathematical tools and utilities tailored toward neural networks.
- ops
- params: Parameters for constructing neural network models; this module implements parameters using the ParamsBase struct and its associated types.
- prelude
- traits
- types
- utils
Structs§
- PadActionIter: An iterator over the variants of PadAction.
- Padding
- ParamsBase: This structure extends the ArrayBase type to include a bias.
Enums§
- PadAction
Traits§
- Affine: Apply an affine transformation to a tensor; the transformation is defined as mul * self + add.
- ApplyGradient: A trait declaring basic gradient-related routines for a neural network.
- ApplyGradientExt: Extends the ApplyGradient trait by allowing for momentum-based optimization.
- ArrayLike
- Backward: A simple trait denoting a single backward pass through a layer of a neural network.
- Clip
- ClipMut: This trait enables tensor clipping; it is implemented for ArrayBase.
- CrossEntropy
- DecrementAxis: This trait enables an array to remove an axis from itself.
- DefaultLike
- DropOut: Dropout randomly zeroizes elements with a given probability (p).
- FillLike
- Forward: This trait defines the forward pass of the network.
- Heavyside
- IncrementAxis
- Init: The Init trait is a consuming initialization method.
- InitInplace: A trait for initializing an object in-place.
- IntoAxis
- Inverse: This trait enables the inversion of a matrix.
- IsSquare
- L1Norm: A trait for computing the L1 norm of a tensor or array.
- L2Norm: A trait for computing the L2 norm of a tensor or array.
- LinearActivation
- MaskFill: This trait is used to fill an array with a value based on a mask; the mask is a boolean array of the same shape as the array.
- Matmul: A trait denoting objects capable of matrix multiplication.
- Matpow: A trait denoting objects capable of matrix exponentiation.
- MeanAbsoluteError
- MeanSquaredError
- NdActivate
- NdActivateMut
- NdLike
- Norm
- OnesLike
- Pad
- Predict: This trait defines the prediction of the network.
- ReLU (an illustrative sketch follows this list)
- Sigmoid
- Softmax
- SoftmaxAxis
- Tanh
- Transpose: The trait denotes the ability to transpose a tensor.
- Unsqueeze
- ZerosLike
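Many of the activation traits above (ReLU, Sigmoid, Softmax, Tanh) are element-wise operations over ndarray types. The crate's exact trait bounds are not shown in this summary, so the following is only an illustrative sketch of how such a trait can be modeled; the impl target and method signature are assumptions.

```rust
use ndarray::{array, Array, Dimension};

/// Illustrative element-wise ReLU trait, in the spirit of the activation traits above.
trait ReLU {
    fn relu(&self) -> Self;
}

impl<D: Dimension> ReLU for Array<f64, D> {
    fn relu(&self) -> Self {
        // max(x, 0) applied to every element
        self.mapv(|x| x.max(0.0))
    }
}

fn main() {
    let x = array![-2.0, -0.5, 0.0, 1.5];
    assert_eq!(x.relu(), array![0.0, 0.0, 0.0, 1.5]);
}
```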
Functions§
- calculate_pattern_similarity: Calculate the similarity between two patterns.
- clip_gradient: Clip the gradient to a maximum value.
- clip_inf_nan
- concat_iter: Creates an n-dimensional array from an iterator of n-dimensional arrays.
- extract_patterns: Extract common patterns from historical sequences.
- genspace
- heavyside: The Heaviside activation function.
- hstack
- inverse
- is_similar_pattern: Check whether two patterns are similar enough to be considered duplicates.
- linarr
- pad
- pad_to
- relu
- relu_derivative
- sigmoid: The sigmoid activation function, $f(x) = \frac{1}{1 + e^{-x}}$ (a sketch follows this list).
- sigmoid_derivative: The derivative of the sigmoid function.
- softmax
- softmax_axis
- stack_iter: Creates a larger array from an iterator of smaller arrays.
- tanh
- tanh_derivative
- tril: Returns the lower triangular portion of a matrix.
- triu: Returns the upper triangular portion of a matrix.
- vstack
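The sigmoid entry above gives $f(x) = \frac{1}{1 + e^{-x}}$; its derivative is $f'(x) = f(x)\,(1 - f(x))$. As a rough reference only (these are not the crate's implementations, and the real signatures may differ), element-wise versions of sigmoid, sigmoid_derivative, and tril over ndarray arrays could look like this:

```rust
use ndarray::{array, Array1, Array2};

/// Element-wise sigmoid: f(x) = 1 / (1 + e^{-x}).
fn sigmoid(x: &Array1<f64>) -> Array1<f64> {
    x.mapv(|v| 1.0 / (1.0 + (-v).exp()))
}

/// Derivative of the sigmoid: f'(x) = f(x) * (1 - f(x)).
fn sigmoid_derivative(x: &Array1<f64>) -> Array1<f64> {
    sigmoid(x).mapv(|s| s * (1.0 - s))
}

/// Lower-triangular portion of a matrix: entries above the diagonal are zeroed.
fn tril(a: &Array2<f64>) -> Array2<f64> {
    Array2::from_shape_fn(a.dim(), |(i, j)| if j <= i { a[(i, j)] } else { 0.0 })
}

fn main() {
    let x = array![-1.0, 0.0, 1.0];
    println!("{}", sigmoid(&x));            // ≈ [0.269, 0.5, 0.731]
    println!("{}", sigmoid_derivative(&x)); // ≈ [0.197, 0.25, 0.197]

    let m = array![[1.0, 2.0], [3.0, 4.0]];
    println!("{}", tril(&m)); // [[1, 0], [3, 4]]
}
```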
Type Aliases§
- PadResult
- Params: A type alias for owned parameters (see the sketch after this list).
- ParamsView: A type alias for an immutable view of the parameters.
- ParamsViewMut: A type alias for a mutable view of the parameters.
- Result: A type alias for a Result using the crate's Error type.
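These aliases follow the ndarray convention of pinning ArrayBase's storage parameter (Array = ArrayBase&lt;OwnedRepr&lt;A&gt;, D&gt;, ArrayView = ArrayBase&lt;ViewRepr&lt;&amp;A&gt;, D&gt;). The sketch below is purely hypothetical, written against a toy ParamsBase; the crate's actual generic parameters and defaults may differ.

```rust
use ndarray::{ArrayBase, Dimension, Ix2, OwnedRepr, RawData, ViewRepr};

/// Toy stand-in for the crate's `ParamsBase`: a weight tensor plus a bias,
/// both parameterized over the same ndarray storage type `S`.
struct ParamsBase<S, D = Ix2>
where
    S: RawData,
    D: Dimension,
{
    weights: ArrayBase<S, D>,
    bias: ArrayBase<S, D::Smaller>,
}

/// Owned parameters (analogous to ndarray's `Array`).
type Params<A = f64> = ParamsBase<OwnedRepr<A>, Ix2>;
/// Immutable view of the parameters (analogous to `ArrayView`).
type ParamsView<'a, A = f64> = ParamsBase<ViewRepr<&'a A>, Ix2>;
/// Mutable view of the parameters (analogous to `ArrayViewMut`).
type ParamsViewMut<'a, A = f64> = ParamsBase<ViewRepr<&'a mut A>, Ix2>;
```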