Module activate


This module is dedicated to activation functions for neural networks and their components. These functions are often used to introduce non-linearity into the model, allowing it to learn more complex patterns in the data.

§Overview

This module provides a set of activation utilities for neural networks: traits, functions, and other primitives used to define and apply various activation functions.

Traits§

Heavyside
LinearActivation
ReLU
Rho
The Rho trait defines a set of activation functions that can be applied to an implementor of the Apply trait. It provides methods for common activation functions such as linear, heavyside, ReLU, sigmoid, and tanh, along with their derivatives. The trait is generic over a type U, which represents the data type of the input to the activation functions. The trait also inherits a type alias Cont<U>, allowing the output types of its methods to vary with the implementor; a rough sketch of this design appears after the trait list below.
RhoComplex
The RhoComplex trait is similar to the Rho trait in that it provides activation functions for implementors of the Apply trait; however, instead of being generic over any type U, it requires U to implement the ComplexFloat trait. This enables the use of complex numbers in the activation functions, which is particularly useful for signal-processing workloads.
Sigmoid
Softmax
SoftmaxAxis
Tanh
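
The Rho and RhoComplex descriptions above imply a design where activation methods are layered on top of an Apply-style container trait. As a rough, hypothetical sketch of that shape (the trait and method signatures here are assumptions for illustration, not this module's actual definitions), the example below pairs an Apply-like trait carrying a Cont<V> type alias with a Rho-like extension that supplies default relu and sigmoid methods:

```rust
// A minimal sketch only: `Apply`, `Cont`, and these signatures are
// assumptions made for illustration, not this module's actual API.
pub trait Apply<U> {
    /// Output container, allowing the element type to vary per method.
    type Cont<V>;
    /// Apply a function to every element of the container.
    fn apply<V>(&self, f: impl Fn(&U) -> V) -> Self::Cont<V>;
}

pub trait Rho<U>: Apply<U>
where
    U: Copy + Into<f64>,
{
    /// ReLU: max(0, x), applied elementwise.
    fn relu(&self) -> Self::Cont<f64> {
        self.apply(|x| {
            let x: f64 = (*x).into();
            x.max(0.0)
        })
    }
    /// Sigmoid: 1 / (1 + e^(-x)), applied elementwise.
    fn sigmoid(&self) -> Self::Cont<f64> {
        self.apply(|x| {
            let x: f64 = (*x).into();
            1.0 / (1.0 + (-x).exp())
        })
    }
}

// Example implementation for a plain Vec<f64>.
impl Apply<f64> for Vec<f64> {
    type Cont<V> = Vec<V>;
    fn apply<V>(&self, f: impl Fn(&f64) -> V) -> Vec<V> {
        self.iter().map(f).collect()
    }
}

impl Rho<f64> for Vec<f64> {}

fn main() {
    let xs = vec![-1.0_f64, 0.0, 2.0];
    assert_eq!(xs.relu(), vec![0.0, 0.0, 2.0]);
    println!("{:?}", xs.sigmoid());
}
```

The same layering would extend to a RhoComplex-style trait by constraining the element type to a complex float instead of a real one.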

Functions§

heavyside
Heaviside (unit step) activation function.
linear
the linear method is an identity passthrough, f(x) = x, often used in simple models or layers where no activation is needed.
linear_derivative
the linear_derivative method always returns 1, since the derivative of the identity function f(x) = x is constant
relu
the relu activation function: f(x) = max(0, x)
relu_derivative
sigmoid
the sigmoid activation function: σ(x) = 1 / (1 + e^(-x))
sigmoid_derivative
the derivative of the sigmoid function: σ'(x) = σ(x)(1 - σ(x))
softmax
Softmax function: softmax(x)_i = e^(x_i) / Σ_j e^(x_j)
softmax_axis
Softmax function applied along a specified axis of a multi-dimensional input.
tanh
the tanh activation function: tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
tanh_derivative
the derivative of the tanh function: 1 - tanh(x)^2
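
For reference, the standard formulas behind the functions listed above can be written as standalone helpers. This is an illustrative sketch over f64 only; it describes the math rather than this module's actual function signatures, which are likely generic over a float type:

```rust
// Illustrative standalone definitions of the formulas listed above; not the
// signatures exported by this module.

/// Heaviside step: 0 for x < 0, 1 otherwise (the value at 0 is a convention).
fn heavyside(x: f64) -> f64 {
    if x < 0.0 { 0.0 } else { 1.0 }
}

/// ReLU: max(0, x); its derivative is 1 for x > 0 and 0 otherwise.
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

/// Sigmoid: 1 / (1 + e^(-x)); its derivative is sigmoid(x) * (1 - sigmoid(x)).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// Tanh derivative: 1 - tanh(x)^2.
fn tanh_derivative(x: f64) -> f64 {
    1.0 - x.tanh().powi(2)
}

/// Softmax over a slice: exp(x_i - max) / Σ_j exp(x_j - max), with the
/// maximum subtracted for numerical stability.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let max = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    assert_eq!(relu(-2.0), 0.0);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    let p = softmax(&[1.0, 2.0, 3.0]);
    assert!((p.iter().sum::<f64>() - 1.0).abs() < 1e-12);
    println!("{:?} {:?}", heavyside(0.5), tanh_derivative(0.0));
}
```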