SIMD-optimized optimization algorithms
This module implements high-performance optimization algorithms using SIMD instructions for machine learning applications, including gradient descent, coordinate descent, and Newton-type methods.
Structs
- CoordinateDescent - SIMD-optimized coordinate descent optimizer
- GradientDescent - SIMD-optimized gradient descent optimizer
- QuasiNewton - SIMD-optimized quasi-Newton optimizer (L-BFGS)
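All three optimizers iterate the same basic pattern: take the current gradient and move the parameters against it. As a minimal scalar sketch of one such step (the names `gd_step`, `weights`, `grad`, and `lr` are illustrative assumptions, not this module's actual API, and no SIMD is used here):

```rust
/// One plain gradient-descent step: w_i <- w_i - lr * g_i for each coordinate.
/// Illustrative only; the structs above implement SIMD-accelerated versions.
fn gd_step(weights: &mut [f64], grad: &[f64], lr: f64) {
    for (w, g) in weights.iter_mut().zip(grad) {
        *w -= lr * g;
    }
}

fn main() {
    let mut w = vec![1.0, 2.0];
    let g = vec![0.5, -0.5];
    gd_step(&mut w, &g, 0.1);
    println!("{:?}", w);
}
```

The SIMD variants process several coordinates per instruction instead of one at a time, which is where the speedup over this scalar loop comes from.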
Functions
- simd_axpy - SIMD-optimized AXPY operation: y = alpha * x + y
- simd_momentum_update - SIMD-optimized momentum update: v = momentum * v + grad
- simd_scale - SIMD-optimized scaling: x = alpha * x
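The elementwise semantics of these three kernels can be stated as plain scalar loops. The sketch below is a scalar reference model under assumed slice-based signatures; it is not the module's SIMD implementation:

```rust
/// Reference semantics of simd_axpy: y <- alpha * x + y, elementwise.
fn axpy(alpha: f64, x: &[f64], y: &mut [f64]) {
    for (yi, xi) in y.iter_mut().zip(x) {
        *yi += alpha * xi;
    }
}

/// Reference semantics of simd_momentum_update: v <- momentum * v + grad.
fn momentum_update(momentum: f64, v: &mut [f64], grad: &[f64]) {
    for (vi, gi) in v.iter_mut().zip(grad) {
        *vi = momentum * *vi + gi;
    }
}

/// Reference semantics of simd_scale: x <- alpha * x, elementwise.
fn scale(alpha: f64, x: &mut [f64]) {
    for xi in x.iter_mut() {
        *xi *= alpha;
    }
}

fn main() {
    let x = vec![1.0, 2.0];
    let mut y = vec![3.0, 4.0];
    axpy(2.0, &x, &mut y); // y becomes [5.0, 8.0]
    assert_eq!(y, vec![5.0, 8.0]);
    scale(0.5, &mut y); // y becomes [2.5, 4.0]
    assert_eq!(y, vec![2.5, 4.0]);
}
```

AXPY is the classic BLAS Level 1 building block; combining `simd_momentum_update` with `simd_axpy` yields a momentum gradient-descent step (accumulate velocity, then apply it to the parameters).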