Module traits

Modules§

activate
apply
clip
entropy
init
like
mask
norm
propagation
scalar
tensor

Traits§

Activate
ActivateGradient
ApplyGradient
A trait declaring basic gradient-related routines for a neural network
ApplyGradientExt
This trait extends the ApplyGradient trait by allowing for momentum-based optimization
ArrayLike
Backward
Backpropagate a delta through the system
BinaryAction
Clip
A trait denoting objects capable of being clipped between some minimum and some maximum.
ClipMut
This trait enables tensor clipping; it is implemented for ArrayBase.
CrossEntropy
A trait for computing the cross-entropy loss of a tensor or array
DefaultLike
DropOut
[DropOut] randomly zeroizes elements with a given probability (p).
FillLike
Forward
This trait denotes entities capable of performing a single forward step
Init
A trait for creating custom initialization routines for models or other entities.
InitInplace
This trait enables models to implement custom, in-place initialization methods.
L1Norm
A trait for computing the L1 norm of a tensor or array
L2Norm
A trait for computing the L2 norm of a tensor or array
MeanAbsoluteError
A trait for computing the mean absolute error of a tensor or array
MeanSquaredError
A trait for computing the mean squared error of a tensor or array
NdLike
Norm
The Norm trait serves as a unified interface for various normalization routines. At the moment, the trait provides L1 and L2 norms.
Numerical
Numerical is a trait for all numerical types; it implements a number of core operations
OnesLike
Scalar
The Scalar trait extends the Numerical trait with additional mathematical operations, reducing the number of traits required for common machine-learning tasks.
ScalarComplex
Tensor
ZerosLike
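
The Forward and Backward traits above pair a single inference step with the backpropagation of a delta. The sketch below illustrates that pattern with hypothetical trait signatures and a toy `Linear` layer over plain vectors; the actual trait definitions, associated types, and method names in this crate may differ.

```rust
// Hypothetical sketch of the Forward / Backward pattern described above.
// The real trait signatures in this crate may differ.
pub trait Forward<Input> {
    type Output;
    /// Perform a single forward step.
    fn forward(&self, input: &Input) -> Self::Output;
}

pub trait Backward<Input, Delta> {
    type Output;
    /// Backpropagate a delta through the system.
    fn backward(&self, input: &Input, delta: &Delta) -> Self::Output;
}

/// A minimal scalar-weight "layer" used only to illustrate the pattern.
struct Linear {
    weight: f64,
    bias: f64,
}

impl Forward<Vec<f64>> for Linear {
    type Output = Vec<f64>;
    fn forward(&self, input: &Vec<f64>) -> Vec<f64> {
        input.iter().map(|x| self.weight * x + self.bias).collect()
    }
}

impl Backward<Vec<f64>, Vec<f64>> for Linear {
    type Output = Vec<f64>;
    fn backward(&self, _input: &Vec<f64>, delta: &Vec<f64>) -> Vec<f64> {
        // For y = w * x + b, dL/dx = w * dL/dy.
        delta.iter().map(|d| self.weight * d).collect()
    }
}

fn main() {
    let layer = Linear { weight: 2.0, bias: 1.0 };
    let y = layer.forward(&vec![1.0, 2.0]);
    assert_eq!(y, vec![3.0, 5.0]);
    let dx = layer.backward(&vec![1.0, 2.0], &vec![1.0, 1.0]);
    assert_eq!(dx, vec![2.0, 2.0]);
    println!("forward: {:?}, backward: {:?}", y, dx);
}
```

Keeping the forward and backward passes as separate traits lets a type implement inference alone, while optimizers can bound their inputs on `Backward` (and on `ApplyGradient` for parameter updates) only where gradients are actually needed.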