This library provides the core abstractions and utilities for the Concision framework.
§Features
- ParamsBase: A structure for defining the parameters within a neural network.
- Backward: This trait denotes a single backward pass through a layer of a neural network.
- Forward: This trait denotes a single forward pass through a layer of a neural network. A minimal, hypothetical sketch of the Forward/Backward pattern is shown below.
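As a rough illustration of how these pieces fit together, here is a minimal, self-contained sketch built on ndarray. The Forward and Backward traits below are hypothetical stand-ins that mirror the one-pass-per-layer descriptions above (the crate's actual signatures may differ), and the toy Dense struct simply bundles a weight matrix with a bias, the role a ParamsBase-style container plays in the crate.

```rust
use ndarray::{array, Array2};

// Hypothetical trait shapes that mirror the descriptions above; the crate's
// actual Forward/Backward definitions may differ.
trait Forward<X> {
    type Output;
    fn forward(&self, input: &X) -> Self::Output;
}

trait Backward<X, Delta> {
    /// Propagate `delta` back through the layer, updating the parameters in
    /// place and returning the gradient with respect to the input.
    fn backward(&mut self, input: &X, delta: &Delta, lr: f64) -> Delta;
}

/// A toy fully-connected layer, y = W·x + b; its fields stand in for the
/// weight/bias parameters a ParamsBase-style container would hold.
struct Dense {
    weights: Array2<f64>, // shape (out, in)
    bias: Array2<f64>,    // shape (out, 1)
}

impl Forward<Array2<f64>> for Dense {
    type Output = Array2<f64>;
    fn forward(&self, x: &Array2<f64>) -> Array2<f64> {
        self.weights.dot(x) + &self.bias
    }
}

impl Backward<Array2<f64>, Array2<f64>> for Dense {
    fn backward(&mut self, x: &Array2<f64>, delta: &Array2<f64>, lr: f64) -> Array2<f64> {
        // For y = W·x + b: dL/dW = delta · xᵀ, dL/db = delta, dL/dx = Wᵀ · delta.
        let grad_w = delta.dot(&x.t());
        let grad_x = self.weights.t().dot(delta);
        self.weights = &self.weights - &(&grad_w * lr);
        self.bias = &self.bias - &(delta * lr);
        grad_x
    }
}

fn main() {
    let mut layer = Dense {
        weights: array![[0.5, -0.25], [0.75, 0.1]],
        bias: array![[0.0], [0.1]],
    };
    let x = array![[1.0], [2.0]];
    let y = layer.forward(&x); // forward pass
    let _dx = layer.backward(&x, &y, 0.01); // backward pass with a dummy delta
    println!("output:\n{y}");
}
```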
Re-exports§
- pub use super::Activate;
- pub use super::ActivateGradient;
- pub use super::BinaryAction;
- pub use super::error::ParamsError;
- pub use super::params::ParamsBase;
- pub use super::Params;
- pub use super::ParamsView;
- pub use super::ParamsViewMut;
Modules§
- activate - This module implements various activation functions for neural networks.
- error
- init - This module provides the crate with various initialization methods suitable for machine-learning models.
- loss - This module provides various loss functions used in machine learning.
- ops
- params - Parameters for constructing neural network models. This module implements parameters using the ParamsBase struct and its associated types.
- prelude
- traits
- utils - A suite of utilities tailored toward neural networks.
Structs§
- FftDirectionIter - An iterator over the variants of FftDirection
- FftModeIter - An iterator over the variants of FftMode
- FftPlan
- PadActionIter - An iterator over the variants of PadAction
- Padding
Enums§
- FftDirection
- FftMode
- PadAction
Traits§
- Affine - Apply an affine transformation to a tensor; the affine transformation is defined as mul * self + add (a minimal sketch follows this list)
- ApplyGradient - A trait declaring basic gradient-related routines for a neural network
- ApplyGradientExt - This trait extends the ApplyGradient trait by allowing for momentum-based optimization
- ArrayLike
- AsComplex
- Backward - Backward-propagate a delta through the system
- Clip - A trait denoting objects capable of being clipped between some minimum and some maximum.
- ClipMut - This trait enables tensor clipping; it is implemented for ArrayBase
- Codex
- CrossEntropy - A trait for computing the cross-entropy loss of a tensor or array
- DFT - Trait for computing the Discrete Fourier Transform (DFT) of a sequence.
- Decode - Decode defines a standard interface for decoding data.
- DecrementAxis - This trait enables an array to remove an axis from itself
- DefaultLike
- DropOut - Dropout randomly zeroizes elements with a given probability (p).
- Encode - Encode defines a standard interface for encoding data.
- FillLike
- FloorDiv
- Forward - This trait denotes entities capable of performing a single forward step
- Heavyside
- IncrementAxis
- Init - A trait for creating custom initialization routines for models or other entities.
- InitInplace - This trait enables models to implement custom, in-place initialization methods.
- IntoAxis
- IntoComplex - Trait for converting a type into a complex number.
- Inverse - This trait enables the inversion of a matrix
- IsSquare
- L1Norm - A trait for computing the L1 norm of a tensor or array
- L2Norm - A trait for computing the L2 norm of a tensor or array
- LinearActivation
- MaskFill - This trait is used to fill an array with a value based on a mask; the mask is a boolean array of the same shape as the array.
- Matmul - A trait denoting objects capable of matrix multiplication.
- Matpow - A trait denoting objects capable of matrix exponentiation
- MeanAbsoluteError - Compute the mean absolute error (MAE) of the object.
- MeanSquaredError - Compute the mean squared error (MSE) of the object.
- NdActivate
- NdActivateMut
- NdLike
- Norm - The Norm trait serves as a unified interface for various normalization routines; at the moment, it provides L1 and L2 techniques.
- Numerical - Numerical is a trait for all numerical types; implements a number of core operations
- OnesLike
- Pad
- PercentDiff - Compute the percentage difference between two values.
- ReLU
- Root
- RoundTo
- Scalar - The Scalar trait extends the Numerical trait with additional mathematical operations, reducing the number of traits required for various machine-learning tasks.
- ScalarComplex
- Sigmoid
- Softmax
- SoftmaxAxis
- SummaryStatistics - This trait describes the fundamental methods of summary statistics: the mean, standard deviation, variance, and more.
- Tanh
- Tensor
- Transpose - The trait denotes the ability to transpose a tensor
- Unsqueeze
- ZerosLike
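To make the Affine entry above concrete, here is a minimal, hypothetical sketch of the mul * self + add operation applied element-wise to an ndarray array. It is a standalone illustration under that formula, not the trait's actual implementation.

```rust
use ndarray::array;

fn main() {
    // Affine transform as described above: y = mul * x + add, element-wise.
    let x = array![[1.0, 2.0], [3.0, 4.0]];
    let (mul, add) = (2.0, 1.0);
    let y = &x * mul + add;
    assert_eq!(y, array![[3.0, 5.0], [7.0, 9.0]]);
    println!("{y}");
}
```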
Functions§
- calculate_pattern_similarity - Calculate the similarity between two patterns
- clip_gradient - Clip the gradient to a maximum value.
- clip_inf_nan
- concat_iter - Creates an n-dimensional array from an iterator of n-dimensional arrays.
- extract_patterns - Extract common patterns from historical sequences
- fft - Computes the Fast Fourier Transform of a one-dimensional, complex-valued signal.
- floor_div
- genspace
- heavyside - Heaviside activation function
- hstack
- ifft - Computes the Inverse Fast Fourier Transform of a one-dimensional, complex-valued signal.
- inverse
- irfft - Computes the Inverse Fast Fourier Transform of a one-dimensional, real-valued signal. TODO: Fix the function; it currently fails to compute the correct result.
- is_similar_pattern - Check whether two patterns are similar enough to be considered duplicates
- layer_norm
- layer_norm_axis
- linarr
- pad
- pad_to
- relu
- relu_derivative
- rfft - Computes the Fast Fourier Transform of a one-dimensional, real-valued signal. TODO: Optimize the function to avoid unnecessary computation.
- round_to - Round the given value to the given number of decimal places.
- sigmoid - The sigmoid activation function: $f(x) = \frac{1}{1 + e^{-x}}$ (a scalar sketch follows this list)
- sigmoid_derivative - The derivative of the sigmoid function
- softmax
- softmax_axis
- stack_iter - Creates a larger array from an iterator of smaller arrays.
- tanh
- tanh_derivative
- tril - Returns the lower triangular portion of a matrix.
- triu - Returns the upper triangular portion of a matrix.
- vstack
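As a quick reference for the formula listed under sigmoid, here is a hypothetical scalar version of sigmoid and sigmoid_derivative; the crate's own functions may operate on tensors and differ in signature.

```rust
/// Scalar sigmoid: f(x) = 1 / (1 + e^(-x)).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// Derivative of the sigmoid: f'(x) = f(x) * (1 - f(x)).
fn sigmoid_derivative(x: f64) -> f64 {
    let s = sigmoid(x);
    s * (1.0 - s)
}

fn main() {
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    assert!((sigmoid_derivative(0.0) - 0.25).abs() < 1e-12);
    println!("sigmoid(1.0) = {}", sigmoid(1.0));
}
```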