Module activate


This module implements various activation functions for neural networks.


Traits§

Activate
The Activate trait establishes a common interface for types that can be activated according to some function.
ActivateExt
This trait extends the Activate trait with a number of additional activation functions and their derivatives. Note: this trait is automatically implemented for any type that implements Activate, eliminating the need to implement it manually.
ActivateMut
A trait for establishing a common mechanism to activate entities in-place.
Heavyside
LinearActivation
NdActivateMut
ReLU
Sigmoid
Softmax
SoftmaxAxis
Tanh
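A minimal sketch of how the Activate / ActivateMut pattern described above can be modeled; the method names, signatures, and the `Vec<f64>` implementations here are illustrative assumptions, not this module's actual API.

```rust
// Hypothetical sketch of the Activate/ActivateMut pattern; the exact
// signatures in this module may differ.
pub trait Activate {
    type Output;
    /// Apply an activation function element-wise, producing a new value.
    fn activate(&self, f: impl Fn(f64) -> f64) -> Self::Output;
}

pub trait ActivateMut {
    /// Apply an activation function element-wise, in place.
    fn activate_mut(&mut self, f: impl Fn(f64) -> f64);
}

impl Activate for Vec<f64> {
    type Output = Vec<f64>;
    fn activate(&self, f: impl Fn(f64) -> f64) -> Vec<f64> {
        self.iter().map(|&x| f(x)).collect()
    }
}

impl ActivateMut for Vec<f64> {
    fn activate_mut(&mut self, f: impl Fn(f64) -> f64) {
        for x in self.iter_mut() {
            *x = f(*x);
        }
    }
}

fn main() {
    let relu = |x: f64| x.max(0.0);

    let v = vec![-1.0, 0.0, 2.0];
    assert_eq!(v.activate(relu), vec![0.0, 0.0, 2.0]);

    let mut w = vec![-1.0, 3.0];
    w.activate_mut(relu);
    assert_eq!(w, vec![0.0, 3.0]);
}
```

The split between the two traits mirrors the usual allocate-vs-mutate trade-off: `Activate` returns a fresh output, while `ActivateMut` rewrites the buffer and avoids an allocation.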

Functions§

heavyside
Heaviside activation function
relu
The ReLU activation function: $f(x) = \max(0, x)$
relu_derivative
The derivative of the ReLU function
sigmoid
The sigmoid activation function: $f(x) = \frac{1}{1 + e^{-x}}$
sigmoid_derivative
The derivative of the sigmoid function: $f'(x) = f(x)\,(1 - f(x))$
softmax
Softmax function: $f(x_i) = \frac{e^{x_i}}{\sum_j e^{x_j}}$
softmax_axis
Softmax function applied along a specific axis: $f(x_i) = \frac{e^{x_i}}{\sum_j e^{x_j}}$, where the sum runs over the chosen axis
tanh
The tanh activation function: $f(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}$
tanh_derivative
The derivative of the tanh function: $f'(x) = 1 - \tanh^2(x)$
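The formulas above can be sketched as plain Rust functions; this is an illustrative rendering of the listed math, not this module's actual source. The softmax sketch subtracts the maximum before exponentiating, a standard trick to avoid overflow that the formula itself leaves implicit.

```rust
// Illustrative implementations of the activation formulas listed above.
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// f'(x) = f(x) * (1 - f(x)), in terms of the sigmoid itself.
fn sigmoid_derivative(x: f64) -> f64 {
    let s = sigmoid(x);
    s * (1.0 - s)
}

// f'(x) = 1 - tanh^2(x)
fn tanh_derivative(x: f64) -> f64 {
    1.0 - x.tanh().powi(2)
}

// Numerically stable softmax over a slice: shifting by the max leaves
// the result unchanged (the factor cancels) but prevents overflow.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let max = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|&x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    assert_eq!(relu(-2.0), 0.0);
    assert_eq!(relu(3.0), 3.0);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    assert!((sigmoid_derivative(0.0) - 0.25).abs() < 1e-12);
    assert!((tanh_derivative(0.0) - 1.0).abs() < 1e-12);

    // Softmax outputs form a probability distribution.
    let p = softmax(&[1.0, 2.0, 3.0]);
    assert!((p.iter().sum::<f64>() - 1.0).abs() < 1e-12);
    assert!(p[2] > p[1] && p[1] > p[0]);
}
```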