Module activate


This module provides the Activate trait alongside additional primitives and utilities for activating neurons within a neural network.
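As a rough sketch of how the pieces below fit together, an activation is applied element-wise over a container. The free function here is a plain reimplementation of the sigmoid formula for illustration, not this crate's API, and the mapping pattern only mirrors what Activate is described as enabling:

```rust
// Hypothetical usage sketch: element-wise activation over a container.
// `sigmoid` below is a standalone reimplementation of the standard
// formula 1 / (1 + e^{-x}), not this crate's function.

fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn main() {
    let xs = vec![-2.0, 0.0, 2.0];
    // Apply the activation to each element, as `Activate` is meant to do
    // generically for containers.
    let ys: Vec<f64> = xs.iter().map(|&x| sigmoid(x)).collect();
    assert!((ys[1] - 0.5).abs() < 1e-12); // sigmoid(0) = 0.5
    println!("{ys:?}");
}
```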

Modules§

rho
This module defines structural implementations of various activation functions.
utils

Structs§

HeavySide
HyperbolicTangent
Linear
ReLU
Sigmoid
Softmax

Traits§

Activate
Activate is a higher-kinded trait that provides a mechanism to apply a function over the elements within a container or structure.
Activator
An Activator defines an interface for structural activation functions that can be applied to various types.
ActivatorGradient
The ActivatorGradient trait extends the Activator trait to include a method for computing the gradient of the activation function.
HeavysideActivation
LinearActivation
ReLUActivation
SigmoidActivation
SoftmaxActivation
SoftmaxAxis
Computes the softmax activation along a specified axis.
TanhActivation
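The relationship between Activator and ActivatorGradient can be pictured as a trait pair: one method produces the activation, and an extension method produces its derivative for backpropagation. The following is a minimal sketch under assumed names; the method names `activate` and `gradient`, and the associated `Output` type, are assumptions rather than this crate's actual signatures:

```rust
// Sketch of an `Activator` / `ActivatorGradient` trait pair.
// All names here are illustrative assumptions, not the crate's API.

trait Activator<T> {
    type Output;
    /// Apply the activation function to the input.
    fn activate(&self, input: T) -> Self::Output;
}

trait ActivatorGradient<T>: Activator<T> {
    /// Derivative of the activation, as used during backpropagation.
    fn gradient(&self, input: T) -> Self::Output;
}

struct ReLU;

impl Activator<f64> for ReLU {
    type Output = f64;
    fn activate(&self, input: f64) -> f64 {
        input.max(0.0)
    }
}

impl ActivatorGradient<f64> for ReLU {
    fn gradient(&self, input: f64) -> f64 {
        if input > 0.0 { 1.0 } else { 0.0 }
    }
}

fn main() {
    let relu = ReLU;
    assert_eq!(relu.activate(-3.0), 0.0);
    assert_eq!(relu.activate(2.5), 2.5);
    assert_eq!(relu.gradient(2.5), 1.0);
    assert_eq!(relu.gradient(-2.5), 0.0);
}
```

Splitting the gradient into a supertrait extension keeps inference-only activations (which never need derivatives) free of the extra obligation.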

Functions§

heavyside
The Heaviside step function, which maps negative inputs to 0 and non-negative inputs to 1.
linear
The linear activation is essentially a passthrough, often used in simple models or layers where no nonlinearity is needed.
linear_derivative
Always returns 1, since the linear activation is the identity function.
relu
The ReLU activation function: max(0, x).
relu_derivative
The derivative of the ReLU function: 1 for positive inputs, 0 otherwise.
sigmoid
The sigmoid activation function: σ(x) = 1 / (1 + e^(−x)).
sigmoid_derivative
The derivative of the sigmoid function: σ(x)(1 − σ(x)).
softmax
The softmax function, which normalizes a vector into a probability distribution.
softmax_axis
Computes the softmax function along a specified axis.
tanh
The hyperbolic tangent activation function.
tanh_derivative
The derivative of the tanh function: 1 − tanh²(x).
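For reference, the formulas behind several of the functions listed above can be reimplemented in a few lines. These are standalone sketches of the standard mathematics, not this crate's code; in particular, the convention that the Heaviside function returns 1 at zero is an assumption, and the max-subtraction trick in softmax is a common stabilization choice rather than anything the listing specifies:

```rust
// Standalone reimplementations of the standard activation formulas.
// Not this crate's code; names mirror the function list for readability.

fn heavyside(x: f64) -> f64 {
    // Convention assumed here: H(0) = 1.
    if x < 0.0 { 0.0 } else { 1.0 }
}

fn relu(x: f64) -> f64 {
    x.max(0.0)
}

fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn sigmoid_derivative(x: f64) -> f64 {
    let s = sigmoid(x);
    s * (1.0 - s) // σ'(x) = σ(x)(1 − σ(x))
}

fn softmax(xs: &[f64]) -> Vec<f64> {
    // Subtract the max before exponentiating for numerical stability;
    // the result is unchanged because the shift cancels in the ratio.
    let max = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|&x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

fn tanh_derivative(x: f64) -> f64 {
    1.0 - x.tanh().powi(2) // tanh'(x) = 1 − tanh²(x)
}

fn main() {
    assert_eq!(heavyside(-1.0), 0.0);
    assert_eq!(relu(3.0), 3.0);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    assert!((sigmoid_derivative(0.0) - 0.25).abs() < 1e-12);
    let p = softmax(&[1.0, 2.0, 3.0]);
    assert!((p.iter().sum::<f64>() - 1.0).abs() < 1e-12);
    assert!((tanh_derivative(0.0) - 1.0).abs() < 1e-12);
}
```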