Module optim

Optimization algorithms module - PyTorch-compatible optimizers

This module provides a modular structure for optimization algorithms:

  • base - Base PyOptimizer class and common functionality
  • sgd - Stochastic Gradient Descent optimizer
  • adam - Adam and AdamW optimizers
  • adagrad - AdaGrad optimizer
  • rmsprop - RMSprop optimizer

Re-exports§

pub use adagrad::PyAdaGrad;
pub use adam::PyAdam;
pub use adam::PyAdamW;
pub use base::PyOptimizer;
pub use rmsprop::PyRMSprop;
pub use sgd::PySGD;
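
Because these types are re-exported at the optim root, downstream Rust code can name them without spelling out the submodule path. For example (the crate-relative path here is illustrative):

use crate::optim::{PyAdam, PySGD}; // instead of crate::optim::adam::PyAdam and crate::optim::sgd::PySGD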

Modules§

  • adagrad - AdaGrad optimizer
  • adam - Adam and AdamW optimizers
  • base - Base optimizer implementation, the foundation for all PyTorch-compatible optimizers
  • rmsprop - RMSprop optimizer
  • sgd - SGD (Stochastic Gradient Descent) optimizer

Functions§

  • register_optim_module - Register the optim module with Python