Function elu_simd 

pub fn elu_simd<F>(x: &ArrayView1<'_, F>, alpha: F) -> Array1<F>
where F: Float + SimdUnifiedOps,

Apply the ELU (Exponential Linear Unit) activation elementwise using SIMD operations.

ELU is defined as:

  • f(x) = x, if x >= 0
  • f(x) = α * (exp(x) - 1), if x < 0

ELU is used in deep neural networks to:

  • Push mean activations closer to zero (faster learning)
  • Have negative values (unlike ReLU) for better gradient flow
  • Stay smooth at zero when α = 1 (unlike Leaky ReLU, which has a kink there)
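
For reference, the piecewise rule can be written as a plain scalar function (a minimal sketch; the name elu_scalar is illustrative and not part of scirs2_core):

fn elu_scalar(x: f32, alpha: f32) -> f32 {
    if x >= 0.0 {
        x                        // non-negative inputs pass through unchanged
    } else {
        alpha * (x.exp() - 1.0)  // saturates toward -alpha as x goes to -inf
    }
}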

§Arguments

  • x - Input array
  • alpha - Scaling factor for negative inputs (commonly 1.0)

§Returns

  • Array with ELU applied elementwise

§Example

use scirs2_core::ndarray_ext::elementwise::elu_simd;
use ndarray::array;

let x = array![1.0_f32, 0.0, -1.0, -2.0];
let result = elu_simd(&x.view(), 1.0);
assert!((result[0] - 1.0).abs() < 1e-6);  // Positive: unchanged
assert!((result[1] - 0.0).abs() < 1e-6);  // Zero: unchanged
assert!(result[2] < 0.0);  // Negative: α * (exp(x) - 1) < 0
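
As a cross-check, the SIMD result can be compared against the piecewise definition applied with plain ndarray's mapv (a minimal sketch; only elu_simd is assumed from scirs2_core):

use scirs2_core::ndarray_ext::elementwise::elu_simd;
use ndarray::array;

let alpha = 1.0_f32;
let x = array![1.0_f32, 0.0, -1.0, -2.0];
let simd = elu_simd(&x.view(), alpha);
// Scalar reference: apply the same piecewise rule elementwise
let reference = x.mapv(|v| if v >= 0.0 { v } else { alpha * (v.exp() - 1.0) });
for (a, b) in simd.iter().zip(reference.iter()) {
    assert!((a - b).abs() < 1e-6);
}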