Module burn::tensor::activation

The activation module.

Functions

  • gelu: Applies the Gaussian Error Linear Units function, as described in the paper Gaussian Error Linear Units (GELUs).
  • leaky_relu: Applies the leaky rectified linear unit function.
  • log_sigmoid: Applies the log sigmoid function.
  • log_softmax: Applies the log softmax function on the input tensor along the given dimension.
  • mish: Applies the Mish function, as described in the paper Mish: A Self Regularized Non-Monotonic Neural Activation Function.
  • prelu: Applies the parametric ReLU activation PReLU(x) = max(0, x) + α * min(0, x). The input tensor is assumed to have shape [batch_size, channels, …]; alpha is assumed to have shape [channels] or [1].
  • quiet_softmax: Applies the “quiet softmax” function on the input tensor along the given dimension. This function is similar to the softmax function, but it allows for “no selection”, i.e., all outputs can tend to zero.
  • relu: Applies the rectified linear unit function.
  • sigmoid: Applies the sigmoid function.
  • silu: Applies the SiLU function.
  • softmax: Applies the softmax function on the input tensor along the given dimension (a short usage sketch follows this list).
  • softplus: Applies the softplus function.
  • tanh: Applies the tanh function.
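
A minimal usage sketch, not taken from the module docs above: it applies a few of the listed activations to a small 2-D tensor. It assumes the `ndarray` backend feature is enabled (providing `burn::backend::NdArray`) and a recent Burn release where `Tensor::from_floats` takes a device argument and tensors implement `Display`; adjust to your backend and version.

```rust
use burn::backend::NdArray;
use burn::tensor::{activation, Tensor};

fn main() {
    // Assumption: the NdArray (CPU) backend; any Backend implementation works the same way.
    type B = NdArray;
    let device = Default::default();

    // Shape [2, 3]: two rows, three feature columns.
    let x = Tensor::<B, 2>::from_floats([[-1.0, 0.0, 2.0], [3.0, -2.0, 0.5]], &device);

    // Element-wise activations: the output shape matches the input shape.
    let r = activation::relu(x.clone());
    let g = activation::gelu(x.clone());

    // softmax normalizes along the given dimension (here dim = 1, the columns),
    // so each row of the result sums to 1.
    let p = activation::softmax(x, 1);

    println!("relu:    {r}");
    println!("gelu:    {g}");
    println!("softmax: {p}");
}
```

All of these functions are free functions that take the tensor by value and return a new tensor, so clone the input first if you still need the original.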