
Function selu_simd 

pub fn selu_simd<F>(x: &ArrayView1<'_, F>) -> Array1<F>
where
    F: Float + SimdUnifiedOps,

Applies the SELU (Scaled Exponential Linear Unit) activation element-wise using SIMD operations.

SELU is defined as:

  • f(x) = λ * x, if x > 0
  • f(x) = λ * α * (exp(x) - 1), if x <= 0

where λ ≈ 1.0507 and α ≈ 1.6733 are fixed constants.
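
For reference, here is a minimal scalar sketch of the same formula. The helper name selu_scalar and the truncated constant values are illustrative only; the actual function applies this computation element-wise with SIMD operations.

// Scalar reference for the formula above (illustrative only).
const LAMBDA: f32 = 1.050_701; // λ ≈ 1.0507
const ALPHA: f32 = 1.673_263;  // α ≈ 1.6733

fn selu_scalar(x: f32) -> f32 {
    if x > 0.0 {
        LAMBDA * x
    } else {
        LAMBDA * ALPHA * (x.exp() - 1.0)
    }
}
// e.g. selu_scalar(1.0) ≈ 1.0507, selu_scalar(-1.0) ≈ -1.1113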

SELU is the key activation for Self-Normalizing Neural Networks (SNNs):

  • Automatically maintains mean ≈ 0 and variance ≈ 1 through layers
  • Eliminates the need for Batch Normalization
  • Requires LeCun Normal initialization for weights (see the sketch after this list)
  • Works best with fully-connected networks
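
As a hedged illustration of the LeCun Normal requirement, weights for a layer feeding selu_simd can be drawn with zero mean and standard deviation sqrt(1 / fan_in). The rand and rand_distr crates and the helper below are assumptions for this sketch, not part of this crate's API.

use ndarray::Array2;
use rand_distr::{Distribution, Normal};

// Hypothetical helper: LeCun Normal initialization for a dense layer.
fn lecun_normal_weights(fan_in: usize, fan_out: usize) -> Array2<f32> {
    // std = sqrt(1 / fan_in) preserves approximately unit variance of
    // pre-activations, which SELU's self-normalizing behavior relies on.
    let std = (1.0 / fan_in as f32).sqrt();
    let dist = Normal::new(0.0, std).expect("valid standard deviation");
    let mut rng = rand::thread_rng();
    Array2::from_shape_fn((fan_in, fan_out), |_| dist.sample(&mut rng))
}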

§Arguments

  • x - Input array

§Returns

  • Array with SELU applied elementwise

§Example

use scirs2_core::ndarray_ext::elementwise::selu_simd;
use ndarray::{array, ArrayView1};

let x = array![1.0_f32, 0.0, -1.0, -2.0];
let result = selu_simd(&x.view());
assert!(result[0] > 1.0);  // Positive: scaled by λ ≈ 1.0507
assert!((result[1] - 0.0).abs() < 1e-6);  // Zero: unchanged
assert!(result[2] < 0.0);  // Negative: λ * α * (exp(x) - 1) < 0
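
For reference, the expected outputs here are approximately [1.0507, 0.0, -1.1113, -1.5201], following the formula above.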