Structs§
- ConstantLR
- Constant learning rate (no scheduling)
- CosineAnnealingLR
- Cosine annealing scheduler
- CosineAnnealingWarmRestarts
- Cosine annealing with warm restarts
- ExponentialLR
- Exponential decay scheduler: multiply LR by gamma every epoch
- LinearLR
- Linear learning rate schedule
- MultiStepLR
- Multi-step decay: multiply LR by gamma at specific milestones
- OneCycleLR
- One cycle learning rate policy (popular for modern deep learning)
- ReduceLROnPlateau
- Reduce learning rate on plateau (when validation loss stops improving)
- StepLR
- Step decay scheduler: multiply LR by gamma every step_size epochs
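The step-based and exponential schedulers above share the same multiplicative rule. A minimal sketch of StepLR-style decay, using a hypothetical free function rather than this crate's actual API:

```rust
/// Hypothetical sketch (not this crate's API): step decay multiplies the
/// base learning rate by `gamma` once every `step_size` epochs, matching
/// the StepLR description above.
fn step_lr(base_lr: f64, gamma: f64, step_size: u32, epoch: u32) -> f64 {
    base_lr * gamma.powi((epoch / step_size) as i32)
}

fn main() {
    // With base_lr = 0.1, gamma = 0.5, step_size = 10, the rate halves
    // at epochs 10, 20, 30, ...
    for epoch in [0u32, 9, 10, 20] {
        println!("epoch {epoch}: lr = {}", step_lr(0.1, 0.5, 10, epoch));
    }
}
```

ExponentialLR is the special case `step_size = 1`, and MultiStepLR replaces the fixed interval with an explicit milestone list.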
Traits§
- LearningRateScheduler
- Learning rate scheduler trait for adaptive learning rate adjustment during training
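A trait like this typically exposes a per-epoch learning rate query that each struct implements. A hypothetical sketch, with method names and signatures assumed rather than taken from this crate:

```rust
/// Hypothetical trait mirroring the LearningRateScheduler described above;
/// the crate's actual method names and signatures may differ.
trait LearningRateScheduler {
    /// Return the learning rate to use for the given epoch.
    fn get_lr(&self, epoch: u32) -> f64;
}

/// Constant schedule: same rate every epoch (cf. ConstantLR above).
struct ConstantLR {
    lr: f64,
}

impl LearningRateScheduler for ConstantLR {
    fn get_lr(&self, _epoch: u32) -> f64 {
        self.lr
    }
}

/// Exponential decay: base_lr * gamma^epoch (cf. ExponentialLR above).
struct ExponentialLR {
    base_lr: f64,
    gamma: f64,
}

impl LearningRateScheduler for ExponentialLR {
    fn get_lr(&self, epoch: u32) -> f64 {
        self.base_lr * self.gamma.powi(epoch as i32)
    }
}

fn main() {
    // A shared trait lets a training loop swap schedulers via dynamic
    // dispatch without knowing the concrete decay rule.
    let schedulers: Vec<Box<dyn LearningRateScheduler>> = vec![
        Box::new(ConstantLR { lr: 0.01 }),
        Box::new(ExponentialLR { base_lr: 0.1, gamma: 0.9 }),
    ];
    for s in &schedulers {
        println!("epoch 3 lr = {}", s.get_lr(3));
    }
}
```

Trait objects (`Box<dyn LearningRateScheduler>`) are one plausible reason to define scheduling as a trait: the training loop can hold any scheduler behind one interface.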