Crate axonml_optim

axonml-optim - Optimization Algorithms

Provides optimizers for training neural networks, including:

  • SGD with momentum and Nesterov acceleration
  • Adam and AdamW
  • RMSprop
  • Learning rate schedulers (a scheduler sketch follows the example below)

Example§

use axonml_optim::prelude::*;
use axonml_nn::{Linear, Module, Sequential};

// Create model
let model = Sequential::new()
    .add(Linear::new(784, 128))
    .add(Linear::new(128, 10));

// Create optimizer
let mut optimizer = Adam::new(model.parameters(), 0.001);

// Training loop (`input`, `target`, and `compute_loss` are placeholders
// assumed to be defined elsewhere)
for epoch in 0..100 {
    let output = model.forward(&input);
    let loss = compute_loss(&output, &target);

    optimizer.zero_grad();
    loss.backward();
    optimizer.step();
}
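
The re-exported learning rate schedulers can be paired with any of the optimizers. The snippet below is a minimal, hypothetical sketch of driving SGD with a StepLR schedule; the `momentum`/`nesterov` builder methods and the `StepLR::new`/`step` signatures are assumptions in the style of common optimizer APIs and are not confirmed by this page.

use axonml_optim::prelude::*;
use axonml_nn::{Linear, Module, Sequential};

let model = Sequential::new()
    .add(Linear::new(784, 128))
    .add(Linear::new(128, 10));

// Hypothetical builder-style configuration of SGD with Nesterov momentum.
let mut optimizer = SGD::new(model.parameters(), 0.1)
    .momentum(0.9)
    .nesterov(true);

// Hypothetical StepLR: decay the learning rate by a factor of 0.1
// every 30 epochs.
let mut scheduler = StepLR::new(0.1, 30, 0.1);

for epoch in 0..100 {
    // ... zero_grad / backward / step as in the example above ...

    // Advance the schedule once per epoch; how the updated rate is
    // applied to the optimizer depends on the actual LRScheduler API.
    scheduler.step();
}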

@version 0.1.0 @author AutomataNexus Development Team

Re-exports§

pub use adam::Adam;
pub use adam::AdamW;
pub use lr_scheduler::CosineAnnealingLR;
pub use lr_scheduler::ExponentialLR;
pub use lr_scheduler::LRScheduler;
pub use lr_scheduler::MultiStepLR;
pub use lr_scheduler::OneCycleLR;
pub use lr_scheduler::ReduceLROnPlateau;
pub use lr_scheduler::StepLR;
pub use lr_scheduler::WarmupLR;
pub use optimizer::Optimizer;
pub use rmsprop::RMSprop;
pub use sgd::SGD;

Modules§

adam
Adam Optimizer - Adaptive Moment Estimation
lr_scheduler
Learning Rate Schedulers
optimizer
Optimizer Trait - Core Optimizer Interface (see the generic sketch at the end of this page)
prelude
Common imports for optimization.
rmsprop
RMSprop Optimizer
sgd
SGD Optimizer - Stochastic Gradient Descent
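
Since Adam, AdamW, SGD, and RMSprop all implement the Optimizer trait from the optimizer module, training code can be written generically over it. The sketch below is a hypothetical illustration: it assumes the trait exposes the `zero_grad` and `step` methods that the example above calls on `Adam`, and `Tensor` stands in for the crate's actual loss type.

use axonml_optim::Optimizer;

// Hypothetical helper: one update step written against the `Optimizer`
// trait, so Adam, AdamW, SGD, or RMSprop can be swapped in without
// changing the training loop. `zero_grad` and `step` are assumed to be
// trait methods, and `Tensor` is a stand-in for the real loss type.
fn update_step<O: Optimizer>(optimizer: &mut O, loss: &Tensor) {
    optimizer.zero_grad();
    loss.backward();
    optimizer.step();
}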