Module schedulers

Structs§

ConstantLR
Constant learning rate (no scheduling)
CosineAnnealingLR
Cosine annealing scheduler without restarts (see the cosine sketch after this list)
CosineAnnealingWarmRestarts
Cosine annealing with warm restarts
ExponentialLR
Exponential decay scheduler: multiply LR by gamma every epoch
LinearLR
Linear learning rate schedule
MultiStepLR
Multi-step decay: multiply LR by gamma at specific milestones
OneCycleLR
One cycle learning rate policy (popular for modern deep learning)
ReduceLROnPlateau
Reduce learning rate on plateau (when validation loss stops improving)
StepLR
Step decay scheduler: multiply LR by gamma every step_size epochs (see the decay sketch after this list)
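
The three decay-style schedulers (StepLR, ExponentialLR, MultiStepLR) all derive the current rate from the base rate and an epoch counter. A minimal sketch of that math as plain functions, assuming illustrative parameter names (base_lr, gamma, step_size, milestones); this is not the crate's actual API:

// Illustrative schedule math only; not this crate's API.

/// StepLR: multiply the base LR by gamma once every `step_size` epochs.
fn step_lr(base_lr: f64, gamma: f64, step_size: usize, epoch: usize) -> f64 {
    base_lr * gamma.powi((epoch / step_size) as i32)
}

/// ExponentialLR: multiply the base LR by gamma every epoch.
fn exponential_lr(base_lr: f64, gamma: f64, epoch: usize) -> f64 {
    base_lr * gamma.powi(epoch as i32)
}

/// MultiStepLR: multiply the base LR by gamma at each milestone already passed.
fn multi_step_lr(base_lr: f64, gamma: f64, milestones: &[usize], epoch: usize) -> f64 {
    let passed = milestones.iter().filter(|&&m| epoch >= m).count();
    base_lr * gamma.powi(passed as i32)
}

fn main() {
    // With base_lr = 0.1, gamma = 0.5, step_size = 10:
    // epochs 0..9 -> 0.1, epochs 10..19 -> 0.05, epochs 20..29 -> 0.025, ...
    println!("StepLR, epoch 25: {}", step_lr(0.1, 0.5, 10, 25));
    println!("ExponentialLR, epoch 3: {}", exponential_lr(0.1, 0.9, 3));
    println!("MultiStepLR, epoch 40: {}", multi_step_lr(0.1, 0.5, &[30, 80], 40));
}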
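The cosine schedulers follow the standard annealing curve from SGDR (Loshchilov & Hutter, 2017): the rate traces half a cosine from eta_max down to eta_min over t_max epochs, and the warm-restart variant periodically resets that cycle. A sketch of the formula, with eta_min, eta_max, t_max, and t_0 as illustrative names (a fixed-length restart cycle is shown; the usual variant can also grow the cycle by a factor after each restart):

use std::f64::consts::PI;

/// CosineAnnealingLR: anneal from eta_max down to eta_min over t_max epochs.
/// eta_t = eta_min + (eta_max - eta_min) * (1 + cos(pi * t / t_max)) / 2
fn cosine_annealing_lr(eta_min: f64, eta_max: f64, t_max: usize, epoch: usize) -> f64 {
    eta_min + (eta_max - eta_min) * (1.0 + (PI * epoch as f64 / t_max as f64).cos()) / 2.0
}

/// CosineAnnealingWarmRestarts: the same curve, but the epoch counter
/// restarts (LR jumps back to eta_max) every `t_0` epochs.
fn cosine_warm_restarts_lr(eta_min: f64, eta_max: f64, t_0: usize, epoch: usize) -> f64 {
    cosine_annealing_lr(eta_min, eta_max, t_0, epoch % t_0)
}

fn main() {
    for epoch in 0..12 {
        println!(
            "epoch {epoch:2}: plain {:.4}, restarts(t_0 = 5) {:.4}",
            cosine_annealing_lr(0.0, 0.1, 10, epoch),
            cosine_warm_restarts_lr(0.0, 0.1, 5, epoch)
        );
    }
}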

Enums§

AnnealStrategy
Annealing strategy (e.g. cosine or linear) used to shape a schedule such as OneCycleLR

Traits§

LearningRateScheduler
Trait for schedulers that adaptively adjust the learning rate during training
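
How a scheduler plugs into a training loop depends on the trait's actual signature, which this index does not show. A minimal sketch under the assumption of a step-per-epoch interface; the trait shape, the step method, and the StepLr struct below are hypothetical, not the crate's definitions:

/// Hypothetical shape of the LearningRateScheduler trait; the real
/// signature may differ (e.g. it may take the optimizer or an epoch index).
trait LearningRateScheduler {
    /// Advance the schedule by one epoch and return the new learning rate.
    fn step(&mut self) -> f64;
}

/// A StepLR-style implementation of the hypothetical trait.
struct StepLr {
    base_lr: f64,
    gamma: f64,
    step_size: usize,
    epoch: usize,
}

impl LearningRateScheduler for StepLr {
    fn step(&mut self) -> f64 {
        let lr = self.base_lr * self.gamma.powi((self.epoch / self.step_size) as i32);
        self.epoch += 1;
        lr
    }
}

fn main() {
    let mut sched = StepLr { base_lr: 0.1, gamma: 0.5, step_size: 2, epoch: 0 };
    for epoch in 0..6 {
        let lr = sched.step();
        println!("epoch {epoch}: lr = {lr}"); // 0.1, 0.1, 0.05, 0.05, 0.025, 0.025
        // optimizer.set_lr(lr);  // hand the rate to your optimizer here
    }
}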