Neural network optimizers
This module provides optimization algorithms for neural networks, such as SGD, Adam, RMSprop, and Adagrad.
The optimizers in this module are wrappers around the implementations in the scirs2-optim crate, adapted to work with the neural network API.
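As a rough illustration of what one of these algorithms computes, the self-contained sketch below applies a single Adam-style update step to a small parameter vector. It uses only standard Rust and the textbook Adam formulas; it does not call this crate's API, and all names in it are illustrative.

```rust
// Self-contained sketch of one Adam-style update step (illustrative only;
// NOT this crate's API).
fn adam_step(
    params: &mut [f64],
    grads: &[f64],
    m: &mut [f64],  // first-moment (mean of gradients) estimate
    v: &mut [f64],  // second-moment (uncentered variance) estimate
    t: u32,         // 1-based step count, used for bias correction
    lr: f64,
    beta1: f64,
    beta2: f64,
    eps: f64,
) {
    for i in 0..params.len() {
        m[i] = beta1 * m[i] + (1.0 - beta1) * grads[i];
        v[i] = beta2 * v[i] + (1.0 - beta2) * grads[i] * grads[i];
        // Bias-corrected moment estimates.
        let m_hat = m[i] / (1.0 - beta1.powi(t as i32));
        let v_hat = v[i] / (1.0 - beta2.powi(t as i32));
        params[i] -= lr * m_hat / (v_hat.sqrt() + eps);
    }
}

fn main() {
    let mut params = vec![0.5, -1.0];
    let grads = vec![0.2, -0.1];
    let (mut m, mut v) = (vec![0.0; 2], vec![0.0; 2]);
    adam_step(&mut params, &grads, &mut m, &mut v, 1, 0.001, 0.9, 0.999, 1e-8);
    println!("{params:?}"); // each parameter moves by roughly the learning rate
}
```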
Re-exports§
pub use adagrad::Adagrad;
pub use adam::Adam;
pub use adamw::AdamW;
pub use lr_scheduler_wrapper::with_cosine_annealing;
pub use lr_scheduler_wrapper::with_step_decay;
pub use lr_scheduler_wrapper::LRSchedulerOptimizer;
pub use momentum::MomentumOptimizer;
pub use radam::RAdam;
pub use rmsprop::RMSprop;
pub use sgd::SGD;
Modules§
- adagrad: Adagrad optimizer implementation
- adam: Adam optimizer implementation for neural networks
- adamw: AdamW optimizer implementation for neural networks
- lr_scheduler_wrapper: Learning rate scheduler integration for optimizers (see the schedule sketch after this list)
- momentum: Momentum optimizer for neural networks
- radam: RAdam (Rectified Adam) optimizer implementation for neural networks
- rmsprop: RMSprop optimizer implementation
- sgd: Stochastic Gradient Descent optimizer for neural networks
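For orientation, the sketch below shows the standard step-decay and cosine-annealing schedules that the re-exported `with_step_decay` and `with_cosine_annealing` wrappers are named after. The formulas are the common textbook ones; the function names and signatures are illustrative, not the crate's wrapper API.

```rust
// Self-contained sketch of the two learning-rate schedules (illustrative only;
// NOT the crate's wrapper API).
use std::f64::consts::PI;

/// Step decay: multiply the base rate by `gamma` every `step_size` epochs.
fn step_decay(base_lr: f64, gamma: f64, step_size: u32, epoch: u32) -> f64 {
    base_lr * gamma.powi((epoch / step_size) as i32)
}

/// Cosine annealing: interpolate from `max_lr` down to `min_lr`
/// over `t_max` epochs along a half cosine wave.
fn cosine_annealing(min_lr: f64, max_lr: f64, t_max: u32, epoch: u32) -> f64 {
    min_lr + 0.5 * (max_lr - min_lr) * (1.0 + (PI * epoch as f64 / t_max as f64).cos())
}

fn main() {
    // Learning rate over the first few epochs under each schedule.
    for epoch in 0..5 {
        println!(
            "epoch {epoch}: step_decay = {:.4}, cosine = {:.4}",
            step_decay(0.1, 0.5, 2, epoch),
            cosine_annealing(0.0, 0.1, 10, epoch),
        );
    }
}
```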
Traits§
- Optimizer: Trait for neural network optimizers
- OptimizerStep: Extension trait for optimizers that can work with model layers
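As a hypothetical illustration of the pattern these traits enable, namely writing training code that is generic over the choice of optimizer, the sketch below defines its own minimal trait. The trait and method names are invented for the example and are not this crate's actual trait definitions.

```rust
// Hypothetical, self-contained sketch: NOT this crate's `Optimizer` trait.
// It only illustrates writing a training helper that is generic over an
// optimizer, which is the pattern the traits above support.
trait SketchOptimizer {
    /// Update parameters in place from their gradients (assumed shape).
    fn step(&mut self, params: &mut [f64], grads: &[f64]);
}

struct SketchSgd {
    lr: f64,
}

impl SketchOptimizer for SketchSgd {
    fn step(&mut self, params: &mut [f64], grads: &[f64]) {
        for (p, g) in params.iter_mut().zip(grads) {
            *p -= self.lr * g; // plain gradient descent update
        }
    }
}

/// A training helper generic over any optimizer implementing the sketch trait.
fn train_step<O: SketchOptimizer>(opt: &mut O, params: &mut [f64], grads: &[f64]) {
    opt.step(params, grads);
}

fn main() {
    let mut params = vec![1.0, -2.0, 0.5];
    let grads = vec![0.1, -0.2, 0.05];
    let mut opt = SketchSgd { lr: 0.01 };
    train_step(&mut opt, &mut params, &grads);
    println!("{params:?}"); // [0.999, -1.998, 0.4995]
}
```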