Optimizers for updating learnable parameters.
Per Tenet #4 (“Borrow-Safe Optimizers”): Optimizer::step takes
&mut GradientStore and selectively drains only the gradients belonging
to its registered parameters. Multiple optimizers can each drain their
subset from the same store (e.g., GAN generator vs. discriminator).
Re-exports§
pub use clip::clip_grad_norm_;
pub use scheduler::CosineAnnealingLR;
pub use scheduler::LRScheduler;
pub use scheduler::StepLR;
Modules§
- clip
- scheduler
Structs§
Traits§
- Optimizer: Trait for all optimizers.