
Function mish_simd 

pub fn mish_simd<F>(x: &ArrayView1<'_, F>) -> Array1<F>
where F: Float + SimdUnifiedOps,

SIMD-accelerated Mish activation function

Computes the Mish activation function element-wise: Mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + exp(x)))

Mish is a self-regularized non-monotonic activation function that combines the benefits of ReLU, Swish, and other modern activations.
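For reference, the same formula can be written as a plain scalar function (a minimal sketch in ordinary f64 arithmetic; mish_simd itself evaluates it lane-wise through SimdUnifiedOps):

fn mish_scalar(x: f64) -> f64 {
    // softplus(x) = ln(1 + e^x); this naive form can overflow for large x,
    // so a production kernel would typically use a numerically stable softplus
    let softplus = x.exp().ln_1p();
    x * softplus.tanh()
}

fn main() {
    // Mish(0) = 0 and Mish(1) matches the value quoted in the example below
    assert!(mish_scalar(0.0).abs() < 1e-12);
    assert!((mish_scalar(1.0) - 0.8650983882673103).abs() < 1e-12);
}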

§Key Properties

  • Mish(0) = 0
  • Smooth and non-monotonic
  • Self-regularizing (bounded negative region)
  • Unbounded above, bounded below (global minimum ≈ -0.3088 at x ≈ -1.19; see the numerical check after this list)
  • Derivative: smooth everywhere and can be written as tanh(softplus(x)) + x · σ(x) · (1 − tanh²(softplus(x)))
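The bounded negative region and the location of the minimum can be checked numerically with the scalar formula above (a sketch using plain f64 math, not the SIMD path):

fn main() {
    // Mish(x) = x * tanh(ln(1 + e^x)), written inline with plain f64 math
    let mish = |x: f64| x * x.exp().ln_1p().tanh();

    // The global minimum is roughly -0.3088, attained near x ≈ -1.19
    let at_min = mish(-1.1924);
    assert!((at_min - (-0.3088)).abs() < 1e-3);

    // Points on either side give larger values, consistent with a minimum there
    assert!(mish(-1.0) > at_min && mish(-1.4) > at_min);
}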

§Usage in Deep Learning

Mish is used in:

  • YOLOv4 object detection (in the CSPDarknet53 backbone)
  • Modern convolutional neural networks
  • Image classification and segmentation tasks

§Examples

use scirs2_core::ndarray_ext::elementwise::mish_simd;
use scirs2_core::ndarray::array;

let x = array![0.0f64, 1.0, -1.0];
let result = mish_simd(&x.view());
// Mish(0) = 0
assert!(result[0].abs() < 1e-10);
// Mish(1) = 1 * tanh(ln(1 + e)) ≈ 0.8651
assert!((result[1] - 0.8650983882673103).abs() < 1e-10);
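// Mish(-1) ≈ -0.3034: negative inputs stay bounded below rather than saturating to large negative values
assert!((result[2] - (-0.3034)).abs() < 1e-3);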