Trait dfdx::optim::Optimizer

pub trait Optimizer<M, D: Storage<E>, E: Dtype> {
    // Required method
    fn update(
        &mut self,
        module: &mut M,
        gradients: &Gradients<E, D>
    ) -> Result<(), OptimizerUpdateError<D::Err>>;
}

All optimizers must implement the update function, which takes a mutable reference to an M and updates all of its parameters.

Notes

  1. update borrows the Gradients immutably (note the &Gradients parameter), so the optimizer neither consumes nor modifies the gradients object.

  2. Optimizer itself is generic over M, rather than the update method. This means a single optimizer object can only update values of one model type M, and the model type must be specified up front when the optimizer is created.
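The second note can be illustrated with a minimal, self-contained sketch. This is not dfdx's actual code; the trait below uses a flat `&[f32]` gradient slice and a toy `Model` type purely to show how being generic over `M` at the trait level ties one optimizer value to one model type:

```rust
// Sketch of an optimizer trait generic over the model type `M`
// (illustrative only; dfdx's real trait uses Gradients and Dtype/Storage).
trait Optimizer<M> {
    fn update(&mut self, module: &mut M, gradients: &[f32]) -> Result<(), String>;
}

// Toy "model": a flat parameter vector.
struct Model {
    params: Vec<f32>,
}

// Plain SGD. Implementing `Optimizer<Model>` fixes the model type up front:
// this optimizer can only ever update values of type `Model`.
struct Sgd {
    lr: f32,
}

impl Optimizer<Model> for Sgd {
    fn update(&mut self, module: &mut Model, gradients: &[f32]) -> Result<(), String> {
        if module.params.len() != gradients.len() {
            return Err("gradient count mismatch".to_string());
        }
        for (p, g) in module.params.iter_mut().zip(gradients) {
            *p -= self.lr * g; // p <- p - lr * g
        }
        Ok(())
    }
}

fn main() {
    let mut model = Model { params: vec![1.0, 2.0] };
    let mut opt = Sgd { lr: 0.1 };
    opt.update(&mut model, &[0.5, -0.5]).unwrap();
    println!("{:?}", model.params);
}
```

Because the model type is a parameter of the trait (not of update), `opt` here cannot be reused to update some other model type; a second model type would require a second optimizer value.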

Required Methods


fn update( &mut self, module: &mut M, gradients: &Gradients<E, D> ) -> Result<(), OptimizerUpdateError<D::Err>>

Updates all of module’s parameters using gradients.

Requires a &mut self because the optimizer may change some internally tracked values.
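The need for `&mut self` can be made concrete with a sketch of a stateful optimizer. This is not dfdx's implementation; the struct and field names below are illustrative. SGD with classic momentum keeps a velocity buffer that must be mutated on every call:

```rust
// Illustrative sketch: an optimizer with internally tracked state,
// which is why `update` must take `&mut self`.
struct MomentumSgd {
    lr: f32,
    momentum: f32,
    velocity: Vec<f32>, // persists and changes between update calls
}

impl MomentumSgd {
    fn update(&mut self, params: &mut [f32], gradients: &[f32]) {
        if self.velocity.len() != params.len() {
            self.velocity = vec![0.0; params.len()];
        }
        for i in 0..params.len() {
            // v <- momentum * v + g;  p <- p - lr * v
            self.velocity[i] = self.momentum * self.velocity[i] + gradients[i];
            params[i] -= self.lr * self.velocity[i];
        }
    }
}

fn main() {
    let mut params = vec![1.0_f32];
    let mut opt = MomentumSgd { lr: 0.1, momentum: 0.9, velocity: vec![] };
    opt.update(&mut params, &[1.0]); // v = 1.0,  p = 1.0 - 0.1  = 0.9
    opt.update(&mut params, &[1.0]); // v = 1.9,  p = 0.9 - 0.19 = 0.71
    println!("{:?}", params);
}
```

An optimizer like plain SGD without momentum could in principle take `&self`, but the trait requires `&mut self` so that stateful implementors such as Adam and RMSprop (which track moving averages across steps) can satisfy it.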

Implementors


impl<M: TensorCollection<E, D>, D: Device<E> + OneFillStorage<E>, E: Dtype> Optimizer<M, D, E> for RMSprop<M, E, D>


impl<M: TensorCollection<E, D>, D: Device<E>, E: Dtype> Optimizer<M, D, E> for Adam<M, E, D>


impl<M: TensorCollection<E, D>, D: Device<E>, E: Dtype> Optimizer<M, D, E> for Sgd<M, E, D>