Traits

- `Activate` - The `Activate` trait establishes a common interface for entities that can be activated according to some function.
- `ActivateExt` - This trait extends the `Activate` trait with a number of additional activation functions and their derivatives. Note: this trait is automatically implemented for any type that implements the `Activate` trait, eliminating the need to implement it manually.
- `ActivateMut` - A trait establishing a common mechanism for activating entities in-place.
- `Heavyside`
- `LinearActivation`
- `NdActivateMut`
- `ReLU`
- `Sigmoid`
- `Softmax`
- `SoftmaxAxis`
- `Tanh`
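To illustrate the trait layout above, here is a minimal sketch of how an `Activate`-style interface with a blanket-implemented extension trait could look. The method name `activate`, the associated `Output` type, and the `Vec<f64>` impl are assumptions for illustration, not the crate's actual definitions.

```rust
/// Common interface for entities that can be activated by some function
/// (hypothetical shape of the `Activate` trait).
trait Activate {
    type Output;

    fn activate<F>(&self, f: F) -> Self::Output
    where
        F: Fn(f64) -> f64;
}

/// Extension trait layering named activations on top of `Activate`
/// (hypothetical shape of `ActivateExt`).
trait ActivateExt: Activate {
    fn sigmoid(&self) -> Self::Output {
        self.activate(|x| 1.0 / (1.0 + (-x).exp()))
    }

    fn relu(&self) -> Self::Output {
        self.activate(|x| x.max(0.0))
    }
}

// Blanket impl: any type implementing `Activate` gets `ActivateExt`
// for free, which is what makes a manual impl unnecessary.
impl<T: Activate> ActivateExt for T {}

impl Activate for Vec<f64> {
    type Output = Vec<f64>;

    fn activate<F>(&self, f: F) -> Self::Output
    where
        F: Fn(f64) -> f64,
    {
        self.iter().copied().map(f).collect()
    }
}

fn main() {
    let v = vec![-1.0, 0.0, 2.0];
    // `relu` comes from the blanket `ActivateExt` impl.
    println!("{:?}", v.relu()); // [0.0, 0.0, 2.0]
}
```

The blanket `impl<T: Activate> ActivateExt for T {}` is the standard Rust pattern for the "automatically implemented" behavior the trait list describes.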
Functions

- `heavyside` - Heaviside activation function
- `relu` - the ReLU activation function: $f(x) = \max(0, x)$
- `relu_derivative` - the derivative of the ReLU function
- `sigmoid` - the sigmoid activation function: $f(x) = \frac{1}{1 + e^{-x}}$
- `sigmoid_derivative` - the derivative of the sigmoid function
- `softmax` - Softmax function: $f(x_i) = \frac{e^{x_i}}{\sum_j e^{x_j}}$
- `softmax_axis` - Softmax function along a specific axis: $f(x_i) = \frac{e^{x_i}}{\sum_j e^{x_j}}$
- `tanh` - the tanh activation function: $f(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}$
- `tanh_derivative` - the derivative of the tanh function
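The formulas above can be sketched as plain scalar/slice functions. These signatures are assumptions (the crate's functions may well operate on `ndarray` types instead); the math matches the listed definitions, with softmax shifted by the maximum for numerical stability.

```rust
/// Heaviside step function (using the H(0) = 0 convention; conventions vary).
fn heavyside(x: f64) -> f64 {
    if x > 0.0 { 1.0 } else { 0.0 }
}

/// ReLU: f(x) = max(0, x).
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

/// Sigmoid: f(x) = 1 / (1 + e^(-x)).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// Sigmoid derivative: s'(x) = s(x) * (1 - s(x)).
fn sigmoid_derivative(x: f64) -> f64 {
    let s = sigmoid(x);
    s * (1.0 - s)
}

/// Tanh derivative: 1 - tanh(x)^2.
fn tanh_derivative(x: f64) -> f64 {
    1.0 - x.tanh().powi(2)
}

/// Softmax over a slice: exponentiate (shifted by the max so large inputs
/// do not overflow) and normalize so the outputs sum to 1.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let max = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|&x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    assert_eq!(heavyside(-3.0), 0.0);
    assert_eq!(relu(-2.0), 0.0);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    assert!((sigmoid_derivative(0.0) - 0.25).abs() < 1e-12);
    assert!((tanh_derivative(0.0) - 1.0).abs() < 1e-12);
    let p = softmax(&[1.0, 2.0, 3.0]);
    assert!((p.iter().sum::<f64>() - 1.0).abs() < 1e-12);
    println!("softmax(1,2,3) = {:?}", p);
}
```

Note that `tanh` itself and `f64::exp` are already in Rust's standard library, so only the composite functions need defining.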