Functions
- heavyside - Heaviside activation function (see the sketch after this list)
- linear - the linear method is essentially a passthrough method, often used in simple models or layers where no activation is needed
- linear_derivative - the linear_derivative method always returns 1, as linear is a simple, single-variable function
- relu - the ReLU activation function (sketch below)
- relu_derivative - the derivative of the ReLU function
- sigmoid - the sigmoid activation function
- sigmoid_derivative - the derivative of the sigmoid function
- softmax - Softmax function (sketch below)
- softmax_axis - Softmax function along a specific axis
- tanh - Hyperbolic tangent
- tanh_derivative - the derivative of the tanh function
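The step and passthrough functions are simple enough to illustrate directly. Below is a minimal NumPy sketch of what heavyside, linear, and linear_derivative plausibly compute; the exact signatures in this library may differ, so the array-in/array-out convention here is an assumption.

```python
import numpy as np

def heavyside(x):
    # Heaviside step: 0 for x < 0, 1 for x >= 0 (the value at 0 is a convention)
    return np.where(x >= 0, 1.0, 0.0)

def linear(x):
    # Passthrough: returns the input unchanged, i.e. f(x) = x
    return x

def linear_derivative(x):
    # d/dx of f(x) = x is 1 everywhere, so return an array of ones
    return np.ones_like(x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(heavyside(x))          # [0. 0. 1. 1.]
print(linear(x))             # [-2.  -0.5  0.   1.5]
print(linear_derivative(x))  # [1. 1. 1. 1.]
```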
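The element-wise activations and their derivatives follow the standard formulas: ReLU is max(0, x), the sigmoid is 1 / (1 + e^(-x)) with derivative sigmoid(x) * (1 - sigmoid(x)), and tanh has derivative 1 - tanh(x)^2. A hedged NumPy sketch, assuming the library applies these element-wise to arrays:

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) applied element-wise
    return np.maximum(0.0, x)

def relu_derivative(x):
    # Derivative of ReLU: 1 where x > 0, 0 elsewhere (value at 0 is a convention)
    return np.where(x > 0, 1.0, 0.0)

def sigmoid(x):
    # Sigmoid: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # Derivative of the sigmoid: sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh(x):
    # Hyperbolic tangent
    return np.tanh(x)

def tanh_derivative(x):
    # Derivative of tanh: 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

x = np.array([-1.0, 0.0, 2.0])
print(relu(x), relu_derivative(x))
print(sigmoid(x), sigmoid_derivative(x))
print(tanh(x), tanh_derivative(x))
```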
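Softmax exponentiates each entry and normalises by the sum of exponentials, softmax(z)_i = e^(z_i) / sum_j e^(z_j); the _axis variant does the same along a chosen axis of a multi-dimensional array. A sketch assuming NumPy arrays and a keyword axis parameter (both assumptions, not confirmed by this documentation):

```python
import numpy as np

def softmax(z):
    # Softmax over a 1-D vector; subtracting the max improves numerical stability
    e = np.exp(z - np.max(z))
    return e / np.sum(e)

def softmax_axis(z, axis=-1):
    # Softmax along a specific axis of an n-D array
    e = np.exp(z - np.max(z, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

z = np.array([[1.0, 2.0, 3.0],
              [1.0, 1.0, 1.0]])
print(softmax(z[0]))            # entries sum to 1
print(softmax_axis(z, axis=1))  # each row sums to 1
```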