Module wyrm::optim

Optimization module.

Contains a number of optimizers, along with the core Optimizer trait they implement.

Structs

Adagrad

Adagrad optimizer, scaling the learning rate by the inverse square root of the accumulated squared gradients.
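
Purely for intuition, a minimal sketch of the Adagrad update applied elementwise to a flat parameter slice. This illustrates the algorithm itself, not wyrm's internals; the function name and the `eps` stabilizer are assumptions:

```rust
/// Illustrative Adagrad step (not wyrm's implementation):
/// accumulate squared gradients, then scale each step by
/// their inverse square root.
fn adagrad_step(
    weights: &mut [f32],
    grads: &[f32],
    accumulated: &mut [f32], // running sum of squared gradients
    learning_rate: f32,
    eps: f32, // small constant for numerical stability
) {
    for ((w, g), acc) in weights.iter_mut().zip(grads).zip(accumulated.iter_mut()) {
        *acc += g * g;
        *w -= learning_rate * g / (acc.sqrt() + eps);
    }
}
```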

Adam

Adam optimizer (adaptive moment estimation).
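
Again as an illustration of the algorithm rather than of wyrm's internals, a sketch of the Adam update with bias-corrected first- and second-moment estimates; all names are assumptions, and the hyperparameter hints in comments follow the defaults suggested by Kingma & Ba (2014):

```rust
/// Illustrative Adam step (not wyrm's implementation): exponential
/// moving averages of the gradient and its square, with bias
/// correction for the early timesteps.
fn adam_step(
    weights: &mut [f32],
    grads: &[f32],
    m: &mut [f32],      // first-moment estimate
    v: &mut [f32],      // second-moment estimate
    t: i32,             // timestep, starting at 1
    learning_rate: f32, // e.g. 1e-3
    beta1: f32,         // e.g. 0.9
    beta2: f32,         // e.g. 0.999
    eps: f32,           // e.g. 1e-8
) {
    for i in 0..weights.len() {
        m[i] = beta1 * m[i] + (1.0 - beta1) * grads[i];
        v[i] = beta2 * v[i] + (1.0 - beta2) * grads[i] * grads[i];
        let m_hat = m[i] / (1.0 - beta1.powi(t));
        let v_hat = v[i] / (1.0 - beta2.powi(t));
        weights[i] -= learning_rate * m_hat / (v_hat.sqrt() + eps);
    }
}
```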

SGD

Standard stochastic gradient descent optimizer with a fixed learning rate.
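
The update is the textbook rule w ← w − lr · ∇w, applied after gradients have been accumulated by a backward pass. A minimal illustrative sketch, not wyrm's implementation:

```rust
/// Illustrative SGD step: subtract the scaled gradient from each
/// weight elementwise.
fn sgd_step(weights: &mut [f32], grads: &[f32], learning_rate: f32) {
    for (w, g) in weights.iter_mut().zip(grads) {
        *w -= learning_rate * g;
    }
}
```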

SynchronizationBarrier

Barrier used to synchronize parameter updates across threads during parallel training.

Traits

Optimizer

Core trait implemented by all optimizers.
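
The trait's exact method set is documented on its own page. Purely to illustrate the pattern of an optimizer abstraction, a hypothetical minimal shape; `Param` and `step` here are invented names for illustration, not wyrm's API:

```rust
/// Hypothetical parameter holding a value and its accumulated
/// gradient; illustrative only.
struct Param {
    value: Vec<f32>,
    gradient: Vec<f32>,
}

/// Hypothetical optimizer trait: each implementor applies its own
/// update rule to the parameters using their accumulated gradients.
trait Optimizer {
    fn step(&mut self, params: &mut [Param]);
}
```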