Module optim

Optimizers for updating learnable parameters.

Per Tenet #4 (“Borrow-Safe Optimizers”): Optimizer::step takes &mut GradientStore and drains only the gradients belonging to its registered parameters. Multiple optimizers can therefore each drain their own subset from the same store (e.g., a GAN’s generator and discriminator optimizers).
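The drain-subset behavior can be sketched as follows. Note this is a simplified illustration, not the crate’s actual implementation: GradientStore is stood in for by a HashMap keyed on parameter ids, and the Sgd struct here is a hypothetical stand-in for the real SGD type.

```rust
use std::collections::HashMap;

/// Simplified stand-in for the crate's GradientStore:
/// maps parameter ids to their accumulated gradients.
struct GradientStore {
    grads: HashMap<usize, f32>,
}

impl GradientStore {
    /// Remove and return the gradient for one parameter, if present.
    fn drain(&mut self, id: usize) -> Option<f32> {
        self.grads.remove(&id)
    }
}

/// Hypothetical SGD that owns a subset of parameter ids.
struct Sgd {
    param_ids: Vec<usize>,
    params: HashMap<usize, f32>,
    lr: f32,
}

impl Sgd {
    /// `step` takes `&mut GradientStore` and drains only the gradients
    /// of its own registered parameters, leaving the rest untouched.
    fn step(&mut self, store: &mut GradientStore) {
        for &id in &self.param_ids {
            if let Some(g) = store.drain(id) {
                *self.params.get_mut(&id).unwrap() -= self.lr * g;
            }
        }
    }
}

fn main() {
    let mut store = GradientStore {
        grads: HashMap::from([(0, 1.0), (1, 2.0), (2, 3.0)]),
    };
    // Two optimizers (e.g. GAN generator vs. discriminator) share one store.
    let mut gen = Sgd {
        param_ids: vec![0],
        params: HashMap::from([(0, 0.5)]),
        lr: 0.1,
    };
    let mut disc = Sgd {
        param_ids: vec![1, 2],
        params: HashMap::from([(1, 0.5), (2, 0.5)]),
        lr: 0.1,
    };

    gen.step(&mut store); // drains only parameter 0's gradient
    assert_eq!(store.grads.len(), 2); // 1 and 2 still pending
    disc.step(&mut store); // drains the remaining subset
    assert!(store.grads.is_empty());
}
```

Because each `step` call removes only its own entries, neither optimizer ever sees (or double-applies) gradients belonging to the other.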

Re-exports§

pub use clip::clip_grad_norm_;
pub use scheduler::CosineAnnealingLR;
pub use scheduler::LRScheduler;
pub use scheduler::StepLR;

Modules§

clip
Gradient clipping utilities.
scheduler
Learning rate schedulers.
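The re-exported clip_grad_norm_ suggests global-norm gradient clipping. A minimal sketch of the usual semantics of such a function is below; the real crate operates on its own gradient types and signature, so the plain slice of f32 values and the free function name used here are simplifying assumptions.

```rust
/// Sketch of global-norm gradient clipping: if the L2 norm of all
/// gradient elements exceeds `max_norm`, scale every element so the
/// total norm equals `max_norm`. (Illustrative only; not the crate's API.)
fn clip_grad_norm(grads: &mut [f32], max_norm: f32) -> f32 {
    // Global L2 norm over all gradient elements.
    let total_norm = grads.iter().map(|g| g * g).sum::<f32>().sqrt();
    if total_norm > max_norm {
        let scale = max_norm / total_norm;
        for g in grads.iter_mut() {
            *g *= scale;
        }
    }
    // Clipping functions conventionally return the pre-clipping norm.
    total_norm
}

fn main() {
    let mut grads = vec![3.0, 4.0]; // L2 norm = 5
    let norm = clip_grad_norm(&mut grads, 1.0);
    assert!((norm - 5.0).abs() < 1e-6);
    // After clipping, the gradient norm is at most max_norm.
    let new_norm = grads.iter().map(|g| g * g).sum::<f32>().sqrt();
    assert!((new_norm - 1.0).abs() < 1e-6);
}
```

Clipping is typically applied between backprop and `Optimizer::step`, so the optimizer consumes the already-rescaled gradients.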

Structs§

Adam
Adam optimizer (adaptive moment estimation).
AdamW
Adam with decoupled weight decay.
SGD
Stochastic gradient descent.

Traits§

Optimizer
Trait for all optimizers.