Function swish_simd

pub fn swish_simd<F>(x: &ArrayView1<'_, F>) -> Array1<F>
where F: Float + SimdUnifiedOps,

SIMD-accelerated Swish (SiLU, the Sigmoid Linear Unit) activation function

Computes the Swish activation function element-wise: Swish(x) = x * sigmoid(x) = x / (1 + exp(-x))

Swish is a self-gated activation function, discovered through neural architecture search, that has become a popular choice in modern neural networks.
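
For reference, the scalar form of the computation can be sketched as follows; the `swish_scalar` helper name is illustrative only and is not part of this crate:

// Illustrative scalar version of the formula above (not part of scirs2_core).
fn swish_scalar(x: f64) -> f64 {
    // Swish(x) = x * sigmoid(x) = x / (1 + exp(-x))
    x / (1.0 + (-x).exp())
}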

§Key Properties

  • Swish(0) = 0
  • Swish is smooth and non-monotonic
  • Has a small negative region (unlike ReLU)
  • Self-gating: x modulates its own activation via sigmoid
  • Unbounded above, bounded below (minimum ≈ -0.278 at x ≈ -1.278; see the spot check after this list)
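
The bounded-below and non-monotonicity claims can be spot-checked numerically. The sketch below reuses the illustrative `swish_scalar` helper from above (an assumption for demonstration, not crate code):

// Hypothetical spot check of the properties above; not part of scirs2_core.
fn swish_scalar(x: f64) -> f64 {
    x / (1.0 + (-x).exp())
}

fn main() {
    // Swish(0) = 0
    assert!(swish_scalar(0.0).abs() < 1e-12);

    // Grid search for the minimum; expect ≈ -0.278 near x ≈ -1.278.
    let (mut min_x, mut min_y) = (0.0_f64, f64::INFINITY);
    for i in 0..100_000 {
        let x = -5.0 + i as f64 * 1e-4;
        let y = swish_scalar(x);
        if y < min_y {
            min_x = x;
            min_y = y;
        }
    }
    assert!((min_x + 1.278).abs() < 1e-2);
    assert!((min_y + 0.278).abs() < 1e-2);

    // Non-monotonic: the minimum lies below values on both sides of it.
    assert!(min_y < swish_scalar(-3.0) && min_y < swish_scalar(0.0));
}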

§Usage in Deep Learning

Swish is used in:

  • EfficientNet and EfficientNetV2
  • GPT-NeoX and other large language models
  • MobileNetV3
  • Many modern vision and NLP architectures

§Examples

use scirs2_core::ndarray_ext::elementwise::swish_simd;
use scirs2_core::ndarray::array;

let x = array![0.0f64, 1.0, -1.0];
let result = swish_simd(&x.view());
assert!(result[0].abs() < 1e-10);  // Swish(0) = 0
// Swish(1) ≈ 0.7311
assert!((result[1] - 0.7310585786).abs() < 1e-6);
// Swish(-1) ≈ -0.2689
assert!((result[2] - (-0.2689414214)).abs() < 1e-6);