Optimization algorithms module - PyTorch-compatible optimizers
This module provides a modular structure for optimization algorithms:
- base - Base PyOptimizer class and common functionality
- sgd - Stochastic Gradient Descent optimizer
- adam - Adam and AdamW optimizers
- adagrad - AdaGrad optimizer
- rmsprop - RMSprop optimizer
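The classes listed under Re-exports below are PyO3 `#[pyclass]` types. The following is a minimal sketch of how one such optimizer class can be exposed to Python; the field names, constructor parameters, and `step` method are assumptions that mirror PyTorch's `torch.optim.SGD` surface, not the crate's actual implementation.

```rust
use pyo3::prelude::*;

/// Hypothetical shape of an SGD optimizer exposed to Python (sketch only).
#[pyclass]
pub struct PySGD {
    lr: f64,       // learning rate (assumed hyperparameter)
    momentum: f64, // momentum factor (assumed hyperparameter)
}

#[pymethods]
impl PySGD {
    /// Constructor callable from Python as `PySGD(lr, momentum)`.
    #[new]
    fn new(lr: f64, momentum: f64) -> Self {
        PySGD { lr, momentum }
    }

    /// Expose the learning rate to Python (illustrative getter).
    #[getter]
    fn lr(&self) -> f64 {
        self.lr
    }

    /// Placeholder step(): a real implementation would update parameters
    /// in place using their gradients and the stored momentum state.
    fn step(&mut self) -> PyResult<()> {
        let _ = self.momentum; // state would be consumed here in a real update
        Ok(())
    }
}
```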
Re-exports§
pub use adagrad::PyAdaGrad;
pub use adam::PyAdam;
pub use adam::PyAdamW;
pub use base::PyOptimizer;
pub use rmsprop::PyRMSprop;
pub use sgd::PySGD;
Modules§
- adagrad - AdaGrad optimizer
- adam - Adam and AdamW optimizers
- base - Base optimizer implementation: foundation for all PyTorch-compatible optimizers
- rmsprop - RMSprop optimizer
- sgd - SGD (Stochastic Gradient Descent) optimizer
Functions§
- register_optim_module - Register the optim module with Python
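The exact signature of register_optim_module is not shown above. As a minimal sketch, a registration helper of this kind is commonly written with PyO3 as below (pre-0.21 GIL-ref API assumed); the `crate::optim` import path and the parameter names are assumptions.

```rust
use pyo3::prelude::*;
// Assumption: the re-exported optimizer classes live under `crate::optim`.
use crate::optim::{PyAdaGrad, PyAdam, PyAdamW, PyOptimizer, PyRMSprop, PySGD};

/// Sketch of a registration helper: build an `optim` submodule, add the
/// optimizer classes to it, and attach it to the parent extension module.
pub fn register_optim_module(py: Python<'_>, parent: &PyModule) -> PyResult<()> {
    let optim = PyModule::new(py, "optim")?;
    optim.add_class::<PyOptimizer>()?;
    optim.add_class::<PySGD>()?;
    optim.add_class::<PyAdam>()?;
    optim.add_class::<PyAdamW>()?;
    optim.add_class::<PyAdaGrad>()?;
    optim.add_class::<PyRMSprop>()?;
    parent.add_submodule(optim)?;
    Ok(())
}
```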