Module traits


This module provides the core traits for the library, such as Backward and Forward.

Modules

apply
clip
codex
gradient
init
like
mask
norm
propagation
scalar
tensor
wnb

Traits

ApplyGradient
A trait declaring basic gradient-related routines for a neural network
ApplyGradientExt
This trait extends the ApplyGradient trait by allowing for momentum-based optimization
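Since this page does not show the trait's signatures, the following is a hypothetical sketch of what an ApplyGradient implementation might look like for a plain vector of weights; the method name, the associated type, and the plain-SGD update rule are all assumptions, not the crate's actual API.

```rust
/// Hypothetical sketch of a gradient-application trait; the real
/// trait in this crate may use different generics and method names.
trait ApplyGradient {
    type Elem;
    /// Apply `grad` to `self`, scaled by the learning rate `lr`.
    fn apply_gradient(&mut self, grad: &[Self::Elem], lr: Self::Elem);
}

impl ApplyGradient for Vec<f64> {
    type Elem = f64;
    fn apply_gradient(&mut self, grad: &[f64], lr: f64) {
        // A plain SGD step: w <- w - lr * g
        for (w, g) in self.iter_mut().zip(grad) {
            *w -= lr * g;
        }
    }
}
```

An ApplyGradientExt analogue would additionally thread a velocity buffer through the update so that momentum can be accumulated across steps.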
ArrayLike
Backward
Backward propagates a delta through the system.
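As an illustration of backward delta propagation, here is a hypothetical sketch with an assumed signature; the stage type `Scale` and the generic parameter are inventions for the example only.

```rust
/// Hypothetical sketch; the crate's actual Backward trait may differ.
trait Backward<Delta> {
    type Output;
    /// Propagate `delta` back through `self`, returning the delta
    /// for the preceding stage.
    fn backward(&self, delta: &Delta) -> Self::Output;
}

/// A stage that scales its input by a fixed factor.
struct Scale(f64);

impl Backward<f64> for Scale {
    type Output = f64;
    fn backward(&self, delta: &f64) -> f64 {
        // d(c * x)/dx = c, so the incoming delta is scaled by c.
        self.0 * delta
    }
}
```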
Biased
Clip
A trait denoting objects capable of being clipped between some minimum and some maximum.
ClipMut
This trait enables tensor clipping; it is implemented for ArrayBase
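A minimal sketch of an in-place clipping trait follows. The crate implements its version for ndarray's ArrayBase; a `Vec<f64>` is used here only to keep the example dependency-free, and the method name is an assumption.

```rust
/// Hypothetical sketch of an in-place clipping trait.
trait ClipMut {
    type Elem;
    /// Clamp every element into the range `[min, max]`.
    fn clip_mut(&mut self, min: Self::Elem, max: Self::Elem);
}

impl ClipMut for Vec<f64> {
    type Elem = f64;
    fn clip_mut(&mut self, min: f64, max: f64) {
        for x in self.iter_mut() {
            *x = x.clamp(min, max);
        }
    }
}
```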
Codex
Decode
Decode defines a standard interface for decoding data.
DefaultLike
DropOut
DropOut randomly zeroizes elements with a given probability (p).
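A sketch of the dropout idea, with a small deterministic LCG standing in for a real random number generator so the example needs no external crates; the free function below is illustrative and is not the crate's DropOut interface.

```rust
/// Deterministic linear congruential generator, used here only so
/// the sketch needs no external crates.
struct Lcg(u64);

impl Lcg {
    /// Produce a pseudo-random value in `[0, 1)`.
    fn next_f64(&mut self) -> f64 {
        // Knuth's MMIX LCG constants.
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        (self.0 >> 11) as f64 / (1u64 << 53) as f64
    }
}

/// Zeroize each element independently with probability `p`.
fn dropout(xs: &mut [f64], p: f64, rng: &mut Lcg) {
    for x in xs.iter_mut() {
        if rng.next_f64() < p {
            *x = 0.0;
        }
    }
}
```

Training-time dropout implementations typically also rescale the surviving elements by 1/(1 - p) (inverted dropout); that step is omitted here for brevity.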
Encode
Encode defines a standard interface for encoding data.
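To illustrate how an Encode/Decode pair (which Codex presumably bundles) might fit together, here is a hypothetical sketch using one-hot encoding; the trait shapes, the `Label` type, and the four-class size are all assumptions for the example.

```rust
/// Hypothetical encoding trait; actual signatures may differ.
trait Encode {
    type Rep;
    fn encode(&self) -> Self::Rep;
}

/// Hypothetical decoding trait; actual signatures may differ.
trait Decode: Sized {
    type Rep;
    fn decode(rep: &Self::Rep) -> Self;
}

#[derive(Debug, PartialEq)]
struct Label(usize);

impl Encode for Label {
    type Rep = Vec<f64>;
    /// One-hot encode the label into a four-class vector.
    fn encode(&self) -> Vec<f64> {
        let mut v = vec![0.0; 4];
        v[self.0] = 1.0;
        v
    }
}

impl Decode for Label {
    type Rep = Vec<f64>;
    /// Argmax-decode a one-hot (or softmax) vector back to a label.
    fn decode(rep: &Vec<f64>) -> Self {
        let idx = rep
            .iter()
            .enumerate()
            .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
            .map(|(i, _)| i)
            .unwrap();
        Label(idx)
    }
}
```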
FillLike
Forward
This trait denotes entities capable of performing a single forward step
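A hypothetical sketch of a single forward step, using a one-dimensional affine map; the generic parameter, associated type, and `Linear` struct are assumptions rather than the crate's actual definitions.

```rust
/// Hypothetical sketch of a forward-pass trait.
trait Forward<Input> {
    type Output;
    /// Compute one forward step over `input`.
    fn forward(&self, input: &Input) -> Self::Output;
}

/// A one-dimensional affine map: y = w * x + b.
struct Linear {
    weight: f64,
    bias: f64,
}

impl Forward<f64> for Linear {
    type Output = f64;
    fn forward(&self, input: &f64) -> f64 {
        self.weight * input + self.bias
    }
}
```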
Gradient
The Gradient trait defines a common interface for all gradients
Init
A trait for creating custom initialization routines for models or other entities.
InitInplace
This trait enables models to implement custom, in-place initialization methods.
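A minimal sketch of in-place initialization, assuming a no-argument method; real routines would typically draw parameters from a distribution (e.g. Xavier/Glorot), while this example uses the common zero-initialization default for a bias vector.

```rust
/// Hypothetical sketch of an in-place initialization trait.
trait InitInplace {
    fn init_inplace(&mut self);
}

struct Bias(Vec<f64>);

impl InitInplace for Bias {
    /// Zero-initialize the bias vector, a common default.
    fn init_inplace(&mut self) {
        for b in self.0.iter_mut() {
            *b = 0.0;
        }
    }
}
```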
L1Norm
A trait for computing the L1 norm of a tensor or array
L2Norm
A trait for computing the L2 norm of a tensor or array
NdLike
Norm
The Norm trait serves as a unified interface for various normalization routines. At the moment, it provides L1 and L2 techniques.
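The two norms named above can be sketched as follows; the trait shapes are assumptions, and `Vec<f64>` stands in for the tensor types the crate actually targets.

```rust
/// Hypothetical sketch of an L1 norm trait.
trait L1Norm {
    type Output;
    fn l1_norm(&self) -> Self::Output;
}

/// Hypothetical sketch of an L2 norm trait.
trait L2Norm {
    type Output;
    fn l2_norm(&self) -> Self::Output;
}

impl L1Norm for Vec<f64> {
    type Output = f64;
    /// Sum of absolute values.
    fn l1_norm(&self) -> f64 {
        self.iter().map(|x| x.abs()).sum()
    }
}

impl L2Norm for Vec<f64> {
    type Output = f64;
    /// Square root of the sum of squares (Euclidean norm).
    fn l2_norm(&self) -> f64 {
        self.iter().map(|x| x * x).sum::<f64>().sqrt()
    }
}
```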
Numerical
Numerical is a trait for all numerical types; it implements a number of core operations
OnesLike
Scalar
The Scalar trait extends the Numerical trait with additional mathematical operations, reducing the number of trait bounds required for common machine-learning tasks.
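To see why such umbrella traits reduce boilerplate, consider a generic dot product written directly against std's operator traits; with a Scalar-style alias, the where-clause below would collapse to a single bound. The bounds shown are stand-ins, not the crate's actual Scalar definition.

```rust
use std::ops::{Add, Mul};

// Without an umbrella trait, every numeric routine repeats bounds
// like these; a Scalar-style trait would collapse them to one.
fn dot<T>(a: &[T], b: &[T], zero: T) -> T
where
    T: Copy + Add<Output = T> + Mul<Output = T>,
{
    a.iter().zip(b).fold(zero, |acc, (&x, &y)| acc + x * y)
}
```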
ScalarComplex
Tensor
Weighted
ZerosLike