Module schedulers


Structs§

ConstantLR
Constant learning rate (no scheduling)
CosineAnnealingLR
Cosine annealing learning rate scheduler
CosineAnnealingWarmRestarts
Cosine annealing with warm restarts
CyclicalLR
Cyclical learning rate policy with different modes
ExponentialLR
Exponential decay scheduler: multiply LR by gamma every epoch
LRScheduleVisualizer
Learning rate schedule visualization helper
LinearLR
Linear learning rate schedule
MultiStepLR
Multi-step decay: multiply LR by gamma at specific milestones
OneCycleLR
One cycle learning rate policy (popular for modern deep learning)
PolynomialLR
Polynomial learning rate decay
ReduceLROnPlateau
Reduce learning rate on plateau (when validation loss stops improving)
StepLR
Step decay scheduler: multiply LR by gamma every step_size epochs
WarmupScheduler
Warmup scheduler that gradually increases learning rate
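Several of the policies above are simple closed-form functions of the epoch index. The sketch below illustrates the update rules for `StepLR`, `ExponentialLR`, and `CosineAnnealingLR` as standalone functions; it shows the math only, not this module's actual struct APIs, and all names and parameters here are illustrative assumptions.

```rust
// Illustrative update rules only; the structs in this module wrap
// equivalent logic behind their own constructors and methods.

/// StepLR: multiply the base LR by `gamma` once every `step_size` epochs.
fn step_lr(base_lr: f64, gamma: f64, step_size: usize, epoch: usize) -> f64 {
    base_lr * gamma.powi((epoch / step_size) as i32)
}

/// ExponentialLR: multiply the LR by `gamma` every epoch.
fn exponential_lr(base_lr: f64, gamma: f64, epoch: usize) -> f64 {
    base_lr * gamma.powi(epoch as i32)
}

/// CosineAnnealingLR: anneal from `base_lr` down to `min_lr` over `t_max` epochs.
fn cosine_annealing_lr(base_lr: f64, min_lr: f64, t: usize, t_max: usize) -> f64 {
    let cos = (std::f64::consts::PI * t as f64 / t_max as f64).cos();
    min_lr + 0.5 * (base_lr - min_lr) * (1.0 + cos)
}

fn main() {
    // Step decay: after 20 epochs with step_size = 10, gamma = 0.5 is applied twice.
    assert!((step_lr(0.1, 0.5, 10, 20) - 0.025).abs() < 1e-12);
    // Exponential decay: 0.1 * 0.9^2.
    assert!((exponential_lr(0.1, 0.9, 2) - 0.081).abs() < 1e-12);
    // Cosine annealing starts at base_lr and ends at min_lr.
    assert!((cosine_annealing_lr(0.1, 0.0, 0, 100) - 0.1).abs() < 1e-12);
    assert!(cosine_annealing_lr(0.1, 0.0, 100, 100).abs() < 1e-12);
    println!("ok");
}
```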

Enums§

AnnealStrategy
CyclicalMode
ScaleMode

Traits§

LearningRateScheduler
Learning rate scheduler trait for adaptive learning rate adjustment during training
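A custom schedule can be supplied by implementing this trait. The sketch below assumes a minimal trait shape with a single epoch-indexed query method; the method name `get_lr` and the `StepDecay` struct are hypothetical stand-ins, so consult the trait's actual signature before implementing it.

```rust
// Hypothetical trait shape for illustration; the real
// `LearningRateScheduler` trait's methods may differ.
trait LearningRateScheduler {
    /// Learning rate to use at the given epoch.
    fn get_lr(&self, epoch: usize) -> f64;
}

/// A StepLR-style policy: multiply `base_lr` by `gamma`
/// once every `step_size` epochs.
struct StepDecay {
    base_lr: f64,
    gamma: f64,
    step_size: usize,
}

impl LearningRateScheduler for StepDecay {
    fn get_lr(&self, epoch: usize) -> f64 {
        self.base_lr * self.gamma.powi((epoch / self.step_size) as i32)
    }
}

fn main() {
    let sched = StepDecay { base_lr: 0.1, gamma: 0.1, step_size: 30 };
    assert!((sched.get_lr(0) - 0.1).abs() < 1e-12);   // before the first milestone
    assert!((sched.get_lr(30) - 0.01).abs() < 1e-12); // one decay applied
    assert!((sched.get_lr(65) - 0.001).abs() < 1e-12); // two decays applied
    println!("ok");
}
```

Keeping the trait epoch-indexed (rather than stateful) makes a schedule trivial to plot with `LRScheduleVisualizer`-style helpers, since the rate at any epoch can be queried without stepping through training.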