# yscv-optim
Optimizers, learning rate schedulers, and gradient clipping for neural network training.
## Example

```rust
use yscv_optim::*;

// Constructor and method names here are illustrative; see the lists below
// for the optimizers and schedulers the crate actually provides.
let mut optimizer = Sgd::new(0.01);
let scheduler = CosineScheduler::new(0.01, 100);
for epoch in 0..100 {
    optimizer.set_lr(scheduler.lr_at(epoch));
    // ... backward pass, then optimizer.step(&mut params, &grads)
}
```
## Optimizers

SGD, Adam, AdamW, RMSprop, RAdam, LARS, LAMB, AdaGrad, and Lookahead (a wrapper around any base optimizer). A minimal sketch of the kind of update rule these implement follows.
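For orientation, here is a standalone sketch of the SGD-with-momentum update over a flat parameter slice. The `Sgd` struct below is illustrative only, not the crate's actual type:

```rust
/// Minimal SGD with momentum over a flat parameter vector.
/// Illustrative sketch; the crate's real optimizers own this logic.
struct Sgd {
    lr: f32,
    momentum: f32,
    velocity: Vec<f32>,
}

impl Sgd {
    fn new(lr: f32, momentum: f32, n_params: usize) -> Self {
        Self { lr, momentum, velocity: vec![0.0; n_params] }
    }

    /// v <- momentum * v + grad;  param <- param - lr * v
    fn step(&mut self, params: &mut [f32], grads: &[f32]) {
        for ((p, g), v) in params.iter_mut().zip(grads).zip(&mut self.velocity) {
            *v = self.momentum * *v + *g;
            *p -= self.lr * *v;
        }
    }
}

fn main() {
    let mut params = vec![1.0_f32, -2.0];
    let grads = vec![0.5_f32, -0.5];
    let mut opt = Sgd::new(0.1, 0.9, params.len());
    opt.step(&mut params, &grads);
    println!("{params:?}"); // [0.95, -1.95]
}
```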
## LR Schedulers

Step, MultiStep, Exponential, Cosine, CosineWarmRestart, Linear, Polynomial, OneCycle, ReduceOnPlateau, and Warmup.
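As an illustration of the curve the cosine schedulers trace, here is a standalone sketch; `cosine_lr` is a hypothetical helper, not the crate's API:

```rust
use std::f64::consts::PI;

/// Cosine annealing from `lr_max` down to `lr_min` over `total_steps`.
/// Illustrative helper only; the crate's scheduler types may differ.
fn cosine_lr(step: usize, total_steps: usize, lr_min: f64, lr_max: f64) -> f64 {
    let progress = step as f64 / total_steps as f64;
    lr_min + 0.5 * (lr_max - lr_min) * (1.0 + (PI * progress).cos())
}

fn main() {
    // Print a few points along the schedule: starts at lr_max, ends at lr_min.
    for step in (0..=100).step_by(25) {
        println!("step {step:>3}: lr = {:.5}", cosine_lr(step, 100, 1e-5, 1e-1));
    }
}
```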
## Gradient Clipping

- `clip_grad_norm`: L2 norm clipping
- `clip_grad_value`: element-wise value clipping
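L2 norm clipping rescales all gradients uniformly when their global L2 norm exceeds a threshold, preserving the gradient direction. The free-function signature below is assumed for illustration and may not match the crate's actual `clip_grad_norm`:

```rust
/// Scale gradients so their global L2 norm does not exceed `max_norm`.
/// Sketch of the technique; the crate's signature may differ.
fn clip_grad_norm(grads: &mut [f32], max_norm: f32) -> f32 {
    let total_norm = grads.iter().map(|g| g * g).sum::<f32>().sqrt();
    if total_norm > max_norm {
        // Small epsilon guards against division by zero.
        let scale = max_norm / (total_norm + 1e-6);
        for g in grads.iter_mut() {
            *g *= scale;
        }
    }
    total_norm // returned so callers can log the pre-clip norm
}
```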
## Tests

76 tests covering optimizer convergence, scheduler curves, and clipping behavior.
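A convergence test in this style might look like the following self-contained sketch, which minimizes f(x) = x^2 with plain gradient descent (gradient 2x) rather than the crate's own types:

```rust
/// Sketch of an optimizer-convergence test: each step multiplies the
/// iterate by (1 - 2 * lr), so it contracts toward the minimum at x = 0.
#[test]
fn sgd_converges_on_quadratic() {
    let (lr, mut x) = (0.1_f64, 5.0_f64);
    for _ in 0..200 {
        let grad = 2.0 * x;
        x -= lr * grad;
    }
    assert!(x.abs() < 1e-6, "expected convergence to 0, got {x}");
}
```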