Crate concision_traits


Core traits defining fundamental abstractions and operations useful for neural networks.

Modules§

math
Mathematically oriented operators and functions useful in machine learning contexts.
ops
Composable operators for tensor manipulations and transformations, neural networks, and more.
tensor

Traits§

Abs
Affine
Apply an affine transformation to a tensor; an affine transformation is defined as mul * self + add.
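The mul * self + add formula above can be sketched as a trait; note that the signature here is a guess for illustration, not the crate's actual Affine API:

```rust
/// Hypothetical sketch of an affine-transform trait; the real
/// `concision_traits::Affine` signature may differ.
trait Affine<A, B> {
    type Output;
    /// Computes `mul * self + add`.
    fn affine(&self, mul: A, add: B) -> Self::Output;
}

impl Affine<f64, f64> for f64 {
    type Output = f64;
    fn affine(&self, mul: f64, add: f64) -> f64 {
        mul * self + add
    }
}

fn main() {
    // 2.0 * 3.0 + 1.0 = 7.0
    assert_eq!(3.0_f64.affine(2.0, 1.0), 7.0);
}
```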
Apply
Apply is a composable binary operator generally used to apply some object or function onto the caller to produce some output.
ApplyGradient
A trait declaring basic gradient-related routines for a neural network
ApplyGradientExt
This trait extends the ApplyGradient trait by allowing for momentum-based optimization
ApplyMut
ApplyMut provides an interface for mutable containers that can apply a function onto their elements, modifying them in place.
ApplyOnce
The ApplyOnce trait consumes the container and applies the given function to every element before returning a new container with the results.
ArrayLike
AsBiasDim
The AsBiasDim trait is used to define a type that can be used to get the bias dimension of the parameters.
AsComplex
AsComplex defines an interface for converting a reference of some numerical type into a complex number.
Backward
The Backward trait establishes a common interface for completing a single backward step in a neural network or machine learning model.
BackwardStep
Clip
A trait denoting objects capable of being clipped between some minimum and some maximum.
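The clipping operation described here can be illustrated with a free function over f64; the crate's actual Clip trait likely generalizes this over tensors:

```rust
/// Illustrative clipping: constrain `x` to the interval [min, max].
/// This is a sketch of the semantics, not the crate's API.
fn clip(x: f64, min: f64, max: f64) -> f64 {
    x.max(min).min(max)
}

fn main() {
    assert_eq!(clip(5.0, 0.0, 1.0), 1.0);  // above the range: clamped to max
    assert_eq!(clip(-2.0, 0.0, 1.0), 0.0); // below the range: clamped to min
    assert_eq!(clip(0.5, 0.0, 1.0), 0.5);  // inside the range: unchanged
}
```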
ClipMut
This trait enables tensor clipping; it is implemented for ArrayBase
Codex
Conjugate
Cos
Cosh
CrossEntropy
A trait for computing the cross-entropy loss of a tensor or array
Cubed
Decode
Decode defines a standard interface for decoding data.
Decrement
Decrement is a chainable trait that defines a decrement method, effectively removing a single unit from the original object to create another
DecrementAxis
The DecrementAxis trait is used as a unary operator for removing a single axis from a multidimensional array or tensor-like structure.
DecrementMut
The DecrementMut trait defines a decrement method that operates in place, modifying the original object.
DefaultLike
Dim
The Dim trait is used to define a type that can be used as a raw dimension. This trait is primarily used to provide abstracted, generic interpretations of the dimensions of the ndarray crate to ensure long-term compatibility.
DimConst
Encode
Encode defines a standard interface for encoding data.
Exp
FillLike
FloorDiv
Forward
The Forward trait describes a common interface for objects designated to perform a single forward step in a neural network or machine learning model.
ForwardMut
ForwardOnce
A consuming implementation of forward propagation
Gradient
The Gradient trait defines the gradient of a function: a function that takes an input and returns a delta, the change in the output with respect to the input.
Increment
The Increment trait is the counterpart to Decrement, defining an increment method that effectively adds a single unit to the original object to create another.
IncrementAxis
The IncrementAxis trait defines a method enabling an axis to increment itself, effectively adding a new axis to the array.
IncrementMut
InitWith
InitWith enables a container to initialize itself with a given value.
Initialize
Initialize provides a mechanism for initializing some object using a value of type T to produce another object.
IntoAxis
The IntoAxis trait is used to define a conversion routine that takes a type and wraps it in an Axis type.
IntoComplex
Trait for converting a type into a complex number.
Inverse
The Inverse trait generically establishes an interface for computing the inverse of a type, regardless of whether it is a tensor, scalar, or some other compatible type.
IsSquare
IsSquare is a trait for checking if the layout, or dimensionality, of a tensor is square.
L1Norm
a trait for computing the L1 norm of a tensor or array
L2Norm
a trait for computing the L2 norm of a tensor or array
Loss
The Loss trait defines a common interface for custom loss function implementations. An implementor defines its algorithm for calculating the loss between two values, lhs and rhs, which may be of different types, X and Y respectively. These generic terms allow flexibility in the permitted types (tensors, scalars, or other data structures) while clearly defining the order in which the operations are performed. Conventionally, lhs is the predicted output and rhs is the actual output, but this is not a strict requirement. The trait also defines an associated type Output, the type of value returned by the loss method, so that different loss functions may return different kinds of loss values, such as scalars or tensors.
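The shape described above can be sketched as follows; the trait and the SquaredError implementor here are hypothetical and only mirror the description, not the crate's real definitions:

```rust
/// Hypothetical sketch of a loss interface with distinct lhs/rhs
/// types and an associated `Output`; the crate's trait may differ.
trait Loss<X, Y = X> {
    type Output;
    fn loss(&self, lhs: &X, rhs: &Y) -> Self::Output;
}

/// A toy sum-of-squared-errors loss over vectors, for illustration.
struct SquaredError;

impl Loss<Vec<f64>> for SquaredError {
    type Output = f64;
    fn loss(&self, lhs: &Vec<f64>, rhs: &Vec<f64>) -> f64 {
        lhs.iter()
            .zip(rhs.iter())
            .map(|(p, a)| (p - a).powi(2))
            .sum()
    }
}

fn main() {
    let pred = vec![1.0, 2.0];   // treated as lhs (predicted)
    let actual = vec![0.0, 2.0]; // treated as rhs (actual)
    assert_eq!(SquaredError.loss(&pred, &actual), 1.0);
}
```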
MapInto
MapInto defines an interface for containers that can consume themselves to apply a given function onto each of their elements.
MapTo
MapTo establishes an interface for containers capable of applying a given function onto each of their elements, by reference.
MaskFill
This trait is used to fill an array with a value based on a mask. The mask is a boolean array of the same shape as the array.
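The mask-fill semantics can be demonstrated on a flat slice with a boolean mask of the same length; the crate's trait targets ndarray-style arrays, so this is only an illustration of the behavior:

```rust
/// Illustrative mask-fill: wherever the mask is true, overwrite the
/// corresponding element with `value`. Hypothetical, not the crate's API.
fn mask_fill(data: &mut [f64], mask: &[bool], value: f64) {
    for (x, &m) in data.iter_mut().zip(mask.iter()) {
        if m {
            *x = value;
        }
    }
}

fn main() {
    let mut v = vec![1.0, 2.0, 3.0];
    mask_fill(&mut v, &[true, false, true], 0.0);
    assert_eq!(v, vec![0.0, 2.0, 0.0]);
}
```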
MatMul
The MatMul trait defines an interface for matrix multiplication.
MatPow
The MatPow trait defines an interface for computing the power of some matrix
MeanAbsoluteError
A trait for computing the mean absolute error of a tensor or array
MeanSquaredError
A trait for computing the mean squared error of a tensor or array
NdGradient
NdLike
NdTensor
Norm
The Norm trait serves as a unified interface for various normalization routines. At the moment, the trait provides L1 and L2 techniques.
OnesLike
PercentChange
The PercentChange trait establishes a binary operator for computing the percent change between two values where the caller is considered the original value.
PercentDiff
Compute the percentage difference between two values.
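The two percentage operators above can be sketched with their conventional formulas; note these are the standard textbook definitions and may or may not match the crate's exact implementations:

```rust
/// Conventional percent change, where the caller is the original value:
/// (new - original) / original * 100. Illustration only.
fn percent_change(original: f64, new: f64) -> f64 {
    (new - original) / original * 100.0
}

/// Conventional percentage difference: the absolute difference divided
/// by the mean of the two values, times 100. Illustration only.
fn percent_diff(a: f64, b: f64) -> f64 {
    (a - b).abs() / ((a + b) / 2.0) * 100.0
}

fn main() {
    assert!((percent_change(4.0, 5.0) - 25.0).abs() < 1e-9);
    assert!((percent_diff(4.0, 6.0) - 40.0).abs() < 1e-9);
}
```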
Predict
The Predict trait is designed as a model-specific interface for making predictions. In the future, we may consider opening the trait up to allow alternative implementations, but for now, it is simply implemented for all implementors of the Forward trait.
PredictWithConfidence
The PredictWithConfidence trait is an extension of the Predict trait, providing an additional method to obtain predictions along with a confidence score.
RawStore
The RawStore trait is used to define an interface for key-value stores like hash-maps, dictionaries, and similar data structures.
RawStoreMut
RawStoreMut extends the RawStore trait by introducing various mutable operations and accessors for elements within the store.
RawTensor
RawTensorData
Root
The Root trait provides methods for computing the nth root of a number.
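An nth root can be computed via powf, as sketched below; the crate's Root trait likely exposes something similar, but its exact API is a guess:

```rust
/// Illustrative nth root: x^(1/n) via `f64::powf`.
/// A sketch of the semantics, not the crate's API.
fn nth_root(x: f64, n: u32) -> f64 {
    x.powf(1.0 / n as f64)
}

fn main() {
    // Floating-point roots are approximate, so compare with a tolerance.
    assert!((nth_root(27.0, 3) - 3.0).abs() < 1e-9);
    assert!((nth_root(16.0, 4) - 2.0).abs() < 1e-9);
}
```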
RoundTo
ScalarTensorData
A marker trait used to denote tensors that represent scalar values; more specifically, any type implementing RawTensorData whose Elem associated type is the implementor itself is considered a scalar value.
Sine
Sinh
SquareRoot
Squared
Store
The Store trait is a more robust interface for key-value stores, building upon both RawStore and RawStoreMut traits by introducing an entry method for in-place manipulation of key-value pairs.
StoreEntry
The StoreEntry trait establishes a common interface for all entries within a key-value store. These types enable in-place manipulation of key-value pairs by allowing for keys to point to empty or vacant slots within the store.
SummaryStatistics
This trait describes the fundamental methods of summary statistics. These include the mean, standard deviation, variance, and more.
Tan
Tanh
TensorBase
Train
This trait defines the training process for the network
Transpose
The Transpose trait generically establishes an interface for transposing a type
Unsqueeze
The Unsqueeze trait establishes an interface for a routine that unsqueezes an array, by inserting a new axis at a specified position. This is useful for reshaping arrays to meet specific dimensional requirements.
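The shape-level effect of unsqueezing can be illustrated without a tensor library; the crate's trait operates on actual arrays, so this only demonstrates where the new length-1 axis lands:

```rust
/// Illustrative unsqueeze on a shape vector: insert a new axis of
/// length 1 at position `axis`. Demonstrates the shape-level effect
/// only; not the crate's API.
fn unsqueeze_shape(shape: &[usize], axis: usize) -> Vec<usize> {
    let mut out = shape.to_vec();
    out.insert(axis, 1);
    out
}

fn main() {
    assert_eq!(unsqueeze_shape(&[3, 4], 0), vec![1, 3, 4]); // leading axis
    assert_eq!(unsqueeze_shape(&[3, 4], 2), vec![3, 4, 1]); // trailing axis
}
```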
ZerosLike