This module provides the Activate trait alongside additional primitives and utilities for activating neurons within a neural network.
Modules§
Structs§
- HeavysideActivation
- LinearActivation
- ReLUActivation
- SigmoidActivation
- SoftmaxActivation
- SoftmaxAxis: compute the softmax activation along a specified axis.
- TanhActivation

Traits§
- Activate: a higher-kinded trait that provides a mechanism to apply a function over the elements within a container or structure.
- Activator: defines an interface for structural activation functions that can be applied onto various types.
- ActivatorGradient: extends the Activator trait with a method for computing the gradient of the activation function.
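To make the trait hierarchy above concrete, here is a minimal sketch of how such traits might fit together. The names mirror the index, but the signatures and the `Vec<f64>` container are assumptions for illustration, not the crate's actual definitions.

```rust
/// Illustrative sketch only: signatures are assumed, not the crate's API.
/// Applies a function element-wise over a container (here, Vec<f64>).
trait Activate {
    fn activate<F: Fn(f64) -> f64>(self, f: F) -> Self;
}

impl Activate for Vec<f64> {
    fn activate<F: Fn(f64) -> f64>(self, f: F) -> Self {
        self.into_iter().map(f).collect()
    }
}

/// A structural activation function applicable to element values.
trait Activator {
    fn apply(&self, x: f64) -> f64;
}

/// Extends Activator with a gradient, as needed for backpropagation.
trait ActivatorGradient: Activator {
    fn gradient(&self, x: f64) -> f64;
}

struct Sigmoid;

impl Activator for Sigmoid {
    fn apply(&self, x: f64) -> f64 {
        1.0 / (1.0 + (-x).exp())
    }
}

impl ActivatorGradient for Sigmoid {
    fn gradient(&self, x: f64) -> f64 {
        let s = self.apply(x);
        s * (1.0 - s) // sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
    }
}

fn main() {
    let xs = vec![-1.0, 0.0, 1.0];
    let ys = xs.activate(|x| Sigmoid.apply(x));
    assert!((ys[1] - 0.5).abs() < 1e-12); // sigmoid(0) = 0.5
    assert!((Sigmoid.gradient(0.0) - 0.25).abs() < 1e-12);
    println!("{:?}", ys);
}
```

The split between `Activator` (forward pass) and `ActivatorGradient` (backward pass) lets containers apply any activation generically while keeping gradient support optional.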
Functions§
- heavyside: the Heaviside step function, 0 for x < 0 and 1 for x >= 0.
- linear: a passthrough function, often used in simple models or layers where no activation is needed.
- linear_derivative: always returns 1, as linear is the single-variable identity function.
- relu: the ReLU activation function, max(0, x).
- relu_derivative: the derivative of the ReLU function.
- sigmoid: the sigmoid activation function, 1 / (1 + e^(-x)).
- sigmoid_derivative: the derivative of the sigmoid function.
- softmax: the softmax function, e^(x_i) / Σ_j e^(x_j).
- softmax_axis: the softmax function computed along a specific axis.
- tanh: the hyperbolic tangent function.
- tanh_derivative: the derivative of the tanh function.
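The scalar functions listed above can be sketched as follows. The function names match the index, but the bodies are standard textbook definitions written for illustration, not the crate's own source.

```rust
// Hedged sketch: standard definitions of the listed activation functions.

fn heavyside(x: f64) -> f64 {
    if x >= 0.0 { 1.0 } else { 0.0 }
}

fn linear(x: f64) -> f64 { x } // passthrough

fn linear_derivative(_x: f64) -> f64 { 1.0 } // identity has constant slope 1

fn relu(x: f64) -> f64 { x.max(0.0) }

fn relu_derivative(x: f64) -> f64 {
    if x > 0.0 { 1.0 } else { 0.0 }
}

fn sigmoid(x: f64) -> f64 { 1.0 / (1.0 + (-x).exp()) }

fn sigmoid_derivative(x: f64) -> f64 {
    let s = sigmoid(x);
    s * (1.0 - s)
}

// tanh'(x) = 1 - tanh(x)^2; tanh itself comes from std's f64::tanh.
fn tanh_derivative(x: f64) -> f64 { 1.0 - x.tanh().powi(2) }

/// Softmax over a slice, shifted by the max value for numerical stability.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let max = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.into_iter().map(|e| e / sum).collect()
}

fn main() {
    assert_eq!(heavyside(-2.0), 0.0);
    assert_eq!(relu(-3.0), 0.0);
    assert_eq!(linear(4.2), 4.2);
    let p = softmax(&[1.0, 2.0, 3.0]);
    assert!((p.iter().sum::<f64>() - 1.0).abs() < 1e-12); // probabilities sum to 1
}
```

A multi-dimensional `softmax_axis` would apply the same computation along one axis of a tensor; that is omitted here since it depends on the crate's tensor type.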