# yscv-optim
Optimizers, learning rate schedulers, and gradient clipping for neural network training.
```rust
use yscv_optim::*;

// Types and constructor arguments are illustrative; exact signatures may differ.
let mut optimizer = Adam::new(1e-3);
let scheduler = CosineAnnealingLr::new(1e-3, 100);
for epoch in 0..100 {
    // train step: optimizer.step(...), then apply the scheduler's lr for `epoch`
}
```
## Optimizers (8 + Lookahead meta-optimizer)
`Sgd`, `Adam`, `AdamW`, `RmsProp`, `RAdam`, `Lars`, `Lamb`, `Adagrad`, plus `Lookahead<O>`, which wraps any of them (see the sketch below).
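As a sketch of what the `Lookahead<O>` wrapper does (the trait and field names here are assumptions, not the crate's actual API): the wrapped optimizer takes `k` fast steps, then a set of slow weights is pulled toward the fast weights by a factor `alpha`, and the fast weights are reset to the slow ones.

```rust
// Hypothetical minimal optimizer trait; the crate's real trait may differ.
trait Optimizer {
    fn step(&mut self, params: &mut [f32], grads: &[f32]);
}

struct Lookahead<O: Optimizer> {
    inner: O,
    slow: Vec<f32>, // slow weights, synced every `k` steps
    k: usize,       // sync interval
    alpha: f32,     // slow-weight interpolation factor
    steps: usize,
}

impl<O: Optimizer> Optimizer for Lookahead<O> {
    fn step(&mut self, params: &mut [f32], grads: &[f32]) {
        self.inner.step(params, grads); // fast weights move as usual
        self.steps += 1;
        if self.steps % self.k == 0 {
            // slow += alpha * (fast - slow); fast is reset to slow
            for (s, p) in self.slow.iter_mut().zip(params.iter_mut()) {
                *s += self.alpha * (*p - *s);
                *p = *s;
            }
        }
    }
}
```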
## LR Schedulers (11)
`StepLr`, `MultiStepLr`, `ExponentialLr`, `CosineAnnealingLr`, `CosineAnnealingWarmRestarts`, `LinearWarmupLr`, `PolynomialDecayLr`, `OneCycleLr`, `ReduceLrOnPlateau`, `CyclicLr`, `LambdaLr`.
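For reference, a standalone sketch of the cosine-annealing curve; `CosineAnnealingLr` presumably follows the standard formula `lr(t) = lr_min + 0.5 * (lr_max - lr_min) * (1 + cos(pi * t / T))`, but the function name and arguments below are illustrative, not the crate's API.

```rust
// Standard cosine-annealing schedule: lr decays smoothly from base_lr
// to min_lr over t_max epochs.
fn cosine_annealing_lr(base_lr: f64, min_lr: f64, epoch: usize, t_max: usize) -> f64 {
    let t = epoch as f64 / t_max as f64;
    min_lr + 0.5 * (base_lr - min_lr) * (1.0 + (std::f64::consts::PI * t).cos())
}

fn main() {
    // Print a few points on the curve: 0.1 down to 1e-4 over 100 epochs.
    for epoch in [0, 25, 50, 75, 100] {
        println!("epoch {:3}: lr = {:.5}", epoch, cosine_annealing_lr(0.1, 1e-4, epoch, 100));
    }
}
```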
## Gradient Clipping
- `clip_grad_norm`: L2 norm clipping
- `clip_grad_value`: element-wise value clipping
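A reference sketch of global L2-norm clipping, which is the standard algorithm (the crate's `clip_grad_norm` signature may differ): compute the L2 norm over all gradient elements and, if it exceeds `max_norm`, rescale every element by `max_norm / total_norm`.

```rust
// Clip gradients in place so their global L2 norm is at most `max_norm`.
fn clip_grad_norm(grads: &mut [f32], max_norm: f32) -> f32 {
    let total_norm = grads.iter().map(|g| g * g).sum::<f32>().sqrt();
    if total_norm > max_norm {
        // Small epsilon guards against division by zero.
        let scale = max_norm / (total_norm + 1e-6);
        for g in grads.iter_mut() {
            *g *= scale;
        }
    }
    total_norm // pre-clip norm, handy for logging
}
```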
## Tests
76 tests covering optimizer convergence, scheduler curves, and clipping behavior.