Module optim

Optimization algorithms for gradient-based learning.

§Usage

use aprender::optim::SGD;
use aprender::primitives::Vector;

// Create optimizer with learning rate 0.01
let mut optimizer = SGD::new(0.01);

// Initialize parameters and gradients
let mut params = Vector::from_slice(&[1.0, 2.0, 3.0]);
let gradients = Vector::from_slice(&[0.1, 0.2, 0.3]);

// Update parameters
optimizer.step(&mut params, &gradients);

// Parameters are updated: params = params - lr * gradients
assert!((params[0] - 0.999).abs() < 1e-6);
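The update rule in the comment above, `params = params - lr * gradients`, can be sketched as a standalone function that does not depend on aprender (this is an illustration of the arithmetic, not aprender's implementation):

```rust
// Standalone sketch of the SGD update rule: each parameter moves
// opposite its gradient, scaled by the learning rate.
fn sgd_step(params: &mut [f64], gradients: &[f64], lr: f64) {
    for (p, g) in params.iter_mut().zip(gradients) {
        *p -= lr * g; // params = params - lr * gradients
    }
}

fn main() {
    let mut params = [1.0, 2.0, 3.0];
    let gradients = [0.1, 0.2, 0.3];
    sgd_step(&mut params, &gradients, 0.01);
    // Matches the example above: 1.0 - 0.01 * 0.1 = 0.999
    assert!((params[0] - 0.999).abs() < 1e-6);
    assert!((params[1] - 1.998).abs() < 1e-6);
    assert!((params[2] - 2.997).abs() < 1e-6);
}
```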

Structs§

Adam
Adam (Adaptive Moment Estimation) optimizer.
SGD
Stochastic Gradient Descent optimizer.
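To illustrate what the Adam entry above computes, here is a standalone sketch of one Adam update with the textbook defaults (β₁ = 0.9, β₂ = 0.999, ε = 1e-8). The struct and function names are illustrative; aprender's actual `Adam` constructor and fields may differ:

```rust
// Illustrative sketch of a single Adam update step (not aprender's API).
struct AdamState {
    m: Vec<f64>, // first-moment (mean) estimates, one per parameter
    v: Vec<f64>, // second-moment (uncentered variance) estimates
    t: u32,      // timestep, used for bias correction
}

fn adam_step(
    params: &mut [f64],
    grads: &[f64],
    state: &mut AdamState,
    lr: f64,
    beta1: f64,
    beta2: f64,
    eps: f64,
) {
    state.t += 1;
    let t = state.t as i32;
    for i in 0..params.len() {
        // Exponential moving averages of the gradient and its square
        state.m[i] = beta1 * state.m[i] + (1.0 - beta1) * grads[i];
        state.v[i] = beta2 * state.v[i] + (1.0 - beta2) * grads[i] * grads[i];
        // Bias-corrected estimates (the averages start at zero)
        let m_hat = state.m[i] / (1.0 - beta1.powi(t));
        let v_hat = state.v[i] / (1.0 - beta2.powi(t));
        // Per-parameter adaptive step
        params[i] -= lr * m_hat / (v_hat.sqrt() + eps);
    }
}

fn main() {
    let mut params = [1.0];
    let mut state = AdamState { m: vec![0.0], v: vec![0.0], t: 0 };
    adam_step(&mut params, &[0.1], &mut state, 0.01, 0.9, 0.999, 1e-8);
    // On the first step the bias correction cancels, so the move is ~lr.
    assert!((params[0] - 0.99).abs() < 1e-6);
}
```

Note that on the very first step the bias-corrected moments reduce to `m_hat = g` and `v_hat = g²`, so the update is approximately `lr` regardless of gradient scale.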

Traits§

Optimizer
Trait for optimizers that update parameters based on gradients.
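Judging from the `step` call in the usage example, such a trait lets training code stay generic over the optimizer. A minimal standalone sketch, with signatures simplified to slices (aprender's actual trait, which presumably uses its `Vector` type, may differ):

```rust
// Illustrative sketch of an optimizer trait (not aprender's exact signature).
trait Optimizer {
    fn step(&mut self, params: &mut [f64], gradients: &[f64]);
}

struct Sgd {
    lr: f64,
}

impl Optimizer for Sgd {
    fn step(&mut self, params: &mut [f64], gradients: &[f64]) {
        for (p, g) in params.iter_mut().zip(gradients) {
            *p -= self.lr * g;
        }
    }
}

// Training code depends only on the trait, so optimizers can be swapped
// without changing the loop.
fn train_step(opt: &mut dyn Optimizer, params: &mut [f64], grads: &[f64]) {
    opt.step(params, grads);
}

fn main() {
    let mut opt = Sgd { lr: 0.01 };
    let mut params = [1.0, 2.0];
    train_step(&mut opt, &mut params, &[0.1, 0.2]);
    assert!((params[0] - 0.999).abs() < 1e-6);
    assert!((params[1] - 1.998).abs() < 1e-6);
}
```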