Structs§
- ConstantLR - Constant learning rate (no scheduling)
- CosineAnnealingLR - Cosine annealing scheduler
- CosineAnnealingWarmRestarts - Cosine annealing with warm restarts
- CyclicalLR - Cyclical learning rate policy with different modes
- ExponentialLR - Exponential decay scheduler: multiply LR by gamma every epoch
- LRScheduleVisualizer - Learning rate schedule visualization helper
- LinearLR - Linear learning rate schedule
- MultiStepLR - Multi-step decay: multiply LR by gamma at specific milestones
- OneCycleLR - One-cycle learning rate policy (popular for modern deep learning)
- PolynomialLR - Polynomial learning rate decay
- ReduceLROnPlateau - Reduce learning rate on plateau (when validation loss stops improving)
- StepLR - Step decay scheduler: multiply LR by gamma every step_size epochs
- WarmupScheduler - Warmup scheduler that gradually increases the learning rate
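To make the decay formulas concrete, here is a minimal sketch of StepLR-style semantics, where the learning rate is multiplied by `gamma` once every `step_size` epochs (i.e. `lr = initial_lr * gamma^(epoch / step_size)`). The field and method names below are illustrative assumptions, not the crate's actual API.

```rust
// Hypothetical sketch of step decay; names are assumptions, not this crate's API.
struct StepLR {
    initial_lr: f64,
    gamma: f64,     // multiplicative decay factor
    step_size: usize, // epochs between decays
}

impl StepLR {
    // lr = initial_lr * gamma^(epoch / step_size), using integer division.
    fn get_lr(&self, epoch: usize) -> f64 {
        self.initial_lr * self.gamma.powi((epoch / self.step_size) as i32)
    }
}

fn main() {
    let sched = StepLR { initial_lr: 0.1, gamma: 0.5, step_size: 10 };
    assert!((sched.get_lr(0) - 0.1).abs() < 1e-12);   // before first decay
    assert!((sched.get_lr(10) - 0.05).abs() < 1e-12); // after one decay
    assert!((sched.get_lr(25) - 0.025).abs() < 1e-12); // after two decays
}
```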
Enums§
Traits§
- LearningRateScheduler - Learning rate scheduler trait for adaptive learning rate adjustment during training
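Since all of the structs above implement the `LearningRateScheduler` trait, a rough sketch of what such a trait could look like may help; the method name `get_lr` and the trait's exact shape here are assumptions, as the index does not show its signature.

```rust
// Hedged sketch of a scheduler trait; the crate's real method names may differ.
trait LearningRateScheduler {
    /// Return the learning rate to use at the given epoch.
    fn get_lr(&self, epoch: usize) -> f64;
}

// ConstantLR: the simplest implementor, always returning the same rate.
struct ConstantLR {
    lr: f64,
}

impl LearningRateScheduler for ConstantLR {
    fn get_lr(&self, _epoch: usize) -> f64 {
        self.lr
    }
}

fn main() {
    let sched = ConstantLR { lr: 0.01 };
    assert_eq!(sched.get_lr(0), 0.01);
    assert_eq!(sched.get_lr(100), 0.01); // unchanged at any epoch
}
```

A trait like this lets training loops accept `&dyn LearningRateScheduler` (or a generic bound) so any of the schedulers listed above can be swapped in without changing the loop.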