# axonml-optim
Optimizers and learning rate schedulers for Axonml.
## Optimizers
- SGD - With momentum, Nesterov momentum, and weight decay (update rule sketched below)
- Adam - Adaptive learning rates
- AdamW - Decoupled weight decay
- RMSprop - Moving average of squared gradients
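For reference, here is a minimal sketch of the update rule behind the SGD variant listed above (momentum, Nesterov, L2 weight decay). It illustrates the textbook algorithm only; it is not the axonml-optim implementation or API.

```rust
// Textbook SGD update with momentum, Nesterov, and L2 weight decay.
// `params`, `grads`, and `velocity` are flat slices of equal length.
// This is an illustrative sketch, not code from the crate.
fn sgd_step(
    params: &mut [f32],
    grads: &[f32],
    velocity: &mut [f32],
    lr: f32,
    momentum: f32,
    weight_decay: f32,
    nesterov: bool,
) {
    for i in 0..params.len() {
        // L2 weight decay: add wd * p to the raw gradient.
        let g = grads[i] + weight_decay * params[i];
        // Momentum buffer: v <- momentum * v + g.
        velocity[i] = momentum * velocity[i] + g;
        // Nesterov uses the look-ahead direction g + momentum * v.
        let update = if nesterov {
            g + momentum * velocity[i]
        } else {
            velocity[i]
        };
        params[i] -= lr * update;
    }
}
```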
## LR Schedulers
- StepLR, MultiStepLR, ExponentialLR
- CosineAnnealingLR, OneCycleLR, WarmupLR (see the schedule sketch after this list)
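As a sketch of what two of these schedules compute, the functions below give the learning rate for a given epoch under StepLR and CosineAnnealingLR. The function names and parameters are illustrative, not the crate's API.

```rust
use std::f32::consts::PI;

// StepLR: multiply the base LR by `gamma` once every `step_size` epochs.
// Assumes `step_size >= 1`. Illustrative only, not the axonml-optim API.
fn step_lr(base_lr: f32, gamma: f32, step_size: u32, epoch: u32) -> f32 {
    base_lr * gamma.powi((epoch / step_size) as i32)
}

// CosineAnnealingLR: anneal from `base_lr` down to `min_lr` over `t_max` epochs
// following half a cosine period. Illustrative only, not the axonml-optim API.
fn cosine_annealing_lr(base_lr: f32, min_lr: f32, t_max: u32, epoch: u32) -> f32 {
    min_lr + 0.5 * (base_lr - min_lr) * (1.0 + (PI * epoch as f32 / t_max as f32).cos())
}
```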
## Usage
The import path and constructor arguments below are illustrative; see the crate docs for the exact signatures.

```rust
use axonml_optim::SGD;

// Hypothetical constructor: trainable parameters plus a learning rate.
let mut optimizer = SGD::new(model.parameters(), 0.01);

optimizer.zero_grad();
loss.backward();
optimizer.step();
```
## License
MIT OR Apache-2.0