Module activation

Activation function kernels for neural networks

Implements common activation functions (ReLU, Sigmoid, etc.)

Structs§

GeluKernel
GELU (Gaussian Error Linear Unit) activation function kernel. Used heavily in modern transformer models and neural networks.
LeakyReluKernel
LeakyReLU activation function kernel. LeakyReLU(x) = max(α*x, x), where α is typically 0.01.
ReluKernel
ReLU activation function kernel
SigmoidKernel
Sigmoid activation function kernel
SwishKernel
Swish (SiLU) activation function kernel. Swish(x) = x * sigmoid(x) = x / (1 + exp(-x)).
TanhKernel
Tanh activation function kernel
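As a rough guide to the element-wise math these kernels compute, here is a minimal sketch of each activation in plain Rust. The free functions below are illustrative only and are not this crate's kernel API; the GELU variant shown is the common tanh approximation, which may differ from the exact erf-based form a kernel implements.

```rust
/// ReLU(x) = max(0, x)
fn relu(x: f32) -> f32 {
    x.max(0.0)
}

/// LeakyReLU(x) = max(α*x, x) for 0 < α < 1 (α is typically 0.01)
fn leaky_relu(x: f32, alpha: f32) -> f32 {
    if x >= 0.0 { x } else { alpha * x }
}

/// Sigmoid(x) = 1 / (1 + exp(-x))
fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp())
}

/// Swish(x) = x * sigmoid(x) = x / (1 + exp(-x))
fn swish(x: f32) -> f32 {
    x * sigmoid(x)
}

/// GELU tanh approximation:
/// 0.5 * x * (1 + tanh(sqrt(2/π) * (x + 0.044715 * x³)))
fn gelu(x: f32) -> f32 {
    const SQRT_2_OVER_PI: f32 = 0.797_884_56;
    0.5 * x * (1.0 + (SQRT_2_OVER_PI * (x + 0.044_715 * x * x * x)).tanh())
}

fn main() {
    assert_eq!(relu(-2.0), 0.0);
    assert_eq!(relu(3.0), 3.0);
    assert!((leaky_relu(-2.0, 0.01) + 0.02).abs() < 1e-6);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-6);
    assert!(swish(0.0).abs() < 1e-6);
    assert!(gelu(0.0).abs() < 1e-6);
}
```

A kernel crate would typically apply these functions element-wise over a tensor or buffer rather than a single `f32`, but the scalar forms above capture the formulas the struct docs reference.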