Module dfdx::optim


Optimizers such as Sgd and Adam that can optimize neural networks.

Structs

Adam — An implementation of the Adam optimizer from Adam: A Method for Stochastic Optimization.

Sgd — An implementation of Stochastic Gradient Descent, based on PyTorch's implementation. (A construction sketch for both optimizers follows this list.)
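
As a rough illustration of how these optimizers are typically constructed, the sketch below builds an Sgd and an Adam instance for a small Linear layer. The exact config field names (lr, momentum, betas, eps), the Momentum enum, and the ..Default::default() fallback are assumptions about a pre-0.11 dfdx API and may differ between versions.

```rust
use dfdx::nn::Linear;
use dfdx::optim::{Adam, AdamConfig, Momentum, Sgd, SgdConfig};

// A tiny layer type just so the optimizers have something to be generic over;
// `Linear<4, 2>` is only an illustrative placeholder.
type Model = Linear<4, 2>;

fn main() {
    // SGD with Nesterov momentum; fields not listed fall back to their defaults
    // (assuming the config types implement `Default`).
    let _sgd: Sgd<Model> = Sgd::new(SgdConfig {
        lr: 1e-2,
        momentum: Some(Momentum::Nesterov(0.9)),
        ..Default::default()
    });

    // Adam with the hyperparameters suggested in the original paper.
    let _adam: Adam<Model> = Adam::new(AdamConfig {
        lr: 1e-3,
        betas: [0.9, 0.999],
        eps: 1e-8,
        ..Default::default()
    });
}
```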

Enums

Traits

Optimizer — All optimizers must implement the update function, which takes an object that implements CanUpdateWithGradients and calls CanUpdateWithGradients::update.
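
To show where update fits into a training loop, here is a minimal sketch of a single training step with Sgd. The helper names used here (trace, mse_loss, backward) and the exact update signature are assumptions about a pre-0.11 dfdx API and may differ between versions.

```rust
use dfdx::losses::mse_loss;
use dfdx::optim::{Optimizer, Sgd};
use dfdx::prelude::*;

// A hypothetical single training step; method names and signatures are
// assumptions about a pre-0.11 dfdx API, not a definitive implementation.
fn train_step(
    model: &mut Linear<4, 2>,
    opt: &mut Sgd<Linear<4, 2>>,
    x: Tensor1D<4>,
    y: Tensor1D<2>,
) {
    // Forward pass with gradient tracing enabled.
    let pred = model.forward(x.trace());
    // Mean squared error against the target.
    let loss = mse_loss(pred, &y);
    // Backprop to collect gradients for every traced parameter.
    let gradients = loss.backward();
    // Ask the optimizer to apply the gradients to the model's parameters.
    let _ = opt.update(model, gradients);
}
```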