Function leaky_relu_simd 

pub fn leaky_relu_simd<F>(x: &ArrayView1<'_, F>, alpha: F) -> Array1<F>
where
    F: Float + SimdUnifiedOps,

Compute Leaky ReLU activation with SIMD acceleration.

Leaky ReLU is a variant of ReLU that allows small negative values to pass through rather than being zeroed: LeakyReLU(x) = max(αx, x), where α is a small constant in (0, 1), typically 0.01.

§Arguments

  • x - Input 1D array
  • alpha - Negative slope coefficient (typically 0.01)

§Returns

Array1<F> containing the Leaky ReLU activation output

§Performance

  • SIMD: Automatically used for large arrays (1000+ elements)
  • Speedup: 3-5x for large arrays
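
As a rough sanity check of the speedup claim, one can time the SIMD path against a plain scalar loop. This is a sketch, not a rigorous benchmark: it assumes scirs2_core::ndarray also re-exports Array1 (only the array! macro appears in the example below), and it should be run in release mode.

use std::time::Instant;
use scirs2_core::ndarray::Array1;
use scirs2_core::ndarray_ext::preprocessing::leaky_relu_simd;

// 100_000 elements is well past the documented 1000-element threshold.
let x = Array1::linspace(-1.0_f64, 1.0, 100_000);

// Scalar baseline: same piecewise rule, no SIMD.
let t = Instant::now();
let baseline = x.mapv(|v| if v > 0.0 { v } else { 0.01 * v });
let scalar_time = t.elapsed();

let t = Instant::now();
let accelerated = leaky_relu_simd(&x.view(), 0.01);
let simd_time = t.elapsed();

// The two paths should agree to floating-point precision.
let max_diff = baseline
    .iter()
    .zip(accelerated.iter())
    .map(|(a, b)| (a - b).abs())
    .fold(0.0_f64, f64::max);
assert!(max_diff < 1e-12);
println!("scalar: {scalar_time:?}, simd: {simd_time:?}");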

§Mathematical Definition

LeakyReLU(x) = { x     if x > 0
               { αx    if x ≤ 0
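
For 0 < α ≤ 1 this piecewise form is equivalent to the max form quoted above, since max(αx, x) selects x when x > 0 and αx when x ≤ 0. A minimal scalar sketch of that equivalence, using the array! macro re-exported as in the example below (illustration only; the real function dispatches to SIMD kernels):

use scirs2_core::ndarray::array;

let x = array![-2.0, 0.0, 3.0];
// Piecewise rule, written out scalar-by-scalar.
let piecewise = x.mapv(|v| if v > 0.0 { v } else { 0.01 * v });
// max(αx, x) with α = 0.01.
let via_max = x.mapv(|v| f64::max(0.01 * v, v));
assert_eq!(piecewise, via_max);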

§Examples

use scirs2_core::ndarray::array;
use scirs2_core::ndarray_ext::preprocessing::leaky_relu_simd;

let x = array![-2.0, -1.0, 0.0, 1.0, 2.0];
let result = leaky_relu_simd(&x.view(), 0.01);

assert_eq!(result[0], -0.02); // -2.0 * 0.01
assert_eq!(result[1], -0.01); // -1.0 * 0.01
assert_eq!(result[2], 0.0);   //  0.0 * 0.01
assert_eq!(result[3], 1.0);   //  1.0 (unchanged)
assert_eq!(result[4], 2.0);   //  2.0 (unchanged)
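
Since the function is generic over F: Float + SimdUnifiedOps, the same call should also work with f32, assuming SimdUnifiedOps is implemented for f32. A hedged sketch under that assumption:

use scirs2_core::ndarray::array;
use scirs2_core::ndarray_ext::preprocessing::leaky_relu_simd;

let x = array![-4.0_f32, 0.5_f32];
let result = leaky_relu_simd(&x.view(), 0.01_f32);
assert!((result[0] - (-0.04_f32)).abs() < 1e-6); // tolerance-based check for f32
assert_eq!(result[1], 0.5_f32);                  // positive values pass through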

§Applications

  • Addressing Dying ReLU Problem: Allows gradient flow for negative inputs (see the gradient sketch after this list)
  • GANs: Common in discriminator networks
  • ResNet Variants: Alternative to standard ReLU
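
The first point follows from the derivative: LeakyReLU′(x) = 1 for x > 0 and α for x ≤ 0 (taking the αx branch's slope at x = 0 by convention), so negative inputs scale gradients by α instead of zeroing them. A minimal sketch of the elementwise gradient, again using the re-exported array! macro (illustration only, not a scirs2_core API):

use scirs2_core::ndarray::array;

let alpha = 0.01;
let x = array![-2.0, 0.0, 1.5];
// Slope is 1 on the positive side, α on the non-positive side,
// so "dead" units still receive a (small) gradient and can recover.
let grad = x.mapv(|v| if v > 0.0 { 1.0 } else { alpha });
assert_eq!(grad, array![0.01, 0.01, 1.0]);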