Module wyrm::optim

Optimization module.

Contains a number of optimizers.

Structs

Adagrad

Adagrad optimizer; scales the learning rate by the inverse square root of the previously accumulated squared gradients.
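
For reference, the standard Adagrad rule (Duchi et al., 2011) that the summary above describes: each parameter accumulates its squared gradients, and the step is divided by the square root of that accumulator. A sketch with learning rate η and a small ε for numerical stability; the exact placement of ε is implementation-specific, so treat it as an assumption:

```latex
G_t = G_{t-1} + g_t \odot g_t, \qquad
\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{G_t} + \epsilon} \odot g_t
```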

Adam

Adam optimizer (adaptive moment estimation).
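
Adam (Kingma and Ba, 2015) keeps exponentially decaying averages of the gradient and the squared gradient, corrects their initialization bias, and scales the step by the ratio of the two. A sketch of the standard update, with decay rates β₁ and β₂; how this crate exposes these knobs is not shown on this page:

```latex
m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \qquad
v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2 \\
\hat{m}_t = \frac{m_t}{1 - \beta_1^t} \qquad
\hat{v}_t = \frac{v_t}{1 - \beta_2^t} \qquad
\theta_{t+1} = \theta_t - \eta\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
```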

SGD

Standard stochastic gradient descent optimizer with a fixed learning rate.
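
A minimal end-to-end sketch of the fixed-rate update θ ← θ − η ∇θL in this crate, adapted from the crate's README (fitting y = 3x + 5 with a single slope and intercept). The builder calls (`SGD::new().learning_rate(..)`) and the `step(loss.parameters())` call follow the README's training loop, but verify them against the struct docs for your version:

```rust
extern crate rand;
extern crate wyrm;

use wyrm::optim::{Optimizer, SGD};
use wyrm::{Arr, InputNode, ParameterNode};

// Helper used in the crate's README examples: a matrix of random entries.
fn random_matrix(rows: usize, cols: usize) -> Arr {
    Arr::zeros((rows, cols)).map(|_| rand::random::<f32>())
}

fn main() {
    // Model: y_hat = slope * x + intercept.
    let slope = ParameterNode::new(random_matrix(1, 1));
    let intercept = ParameterNode::new(random_matrix(1, 1));

    let x = InputNode::new(random_matrix(1, 1));
    let y = InputNode::new(random_matrix(1, 1));

    let y_hat = slope.clone() * x.clone() + intercept.clone();
    let mut loss = (y.clone() - y_hat).square();

    let optimizer = SGD::new().learning_rate(0.1);

    for _ in 0..100 {
        // Draw one (x, y) sample from the target function y = 3x + 5.
        let x_value: f32 = rand::random();
        x.set_value(x_value);
        y.set_value(3.0 * x_value + 5.0);

        // Forward pass, backpropagation, then a fixed-step update.
        loss.forward();
        loss.backward(1.0);
        optimizer.step(loss.parameters());
    }
}
```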

SynchronizedOptimizer

Wrapper that synchronizes an optimizer's update steps across multiple training threads.
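
A sketch of how the wrapper is obtained and used. The `synchronized(num_threads)` entry point comes from the Synchronizable trait below; its exact signature is an assumption here, so check the trait docs:

```rust
use wyrm::optim::{Optimizer, SGD, Synchronizable};

// Wrap a base optimizer so that several Hogwild-style training threads
// coordinate their update steps.
// ASSUMPTION: `synchronized` takes the number of participating threads.
let base = SGD::new().learning_rate(0.1);
let optimizer = base.synchronized(4);

// The wrapper implements `Optimizer`, so each thread's training loop
// calls it exactly like the unwrapped optimizer:
// optimizer.step(loss.parameters());
```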

Enums

Optimizers

Enum wrapping each of the optimizers above, allowing the optimizer to be chosen at runtime.
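
This makes it possible to select an optimizer at runtime while still passing a single concrete type around. A hypothetical selection helper; the variant names are assumed to mirror the struct names above, and the enum is assumed to implement Optimizer itself:

```rust
use wyrm::optim::{Adagrad, Adam, Optimizers, SGD};

// Map a config string to an optimizer variant.
// ASSUMPTION: variants mirror the struct names and default constructors.
fn build_optimizer(name: &str) -> Optimizers {
    match name {
        "adagrad" => Optimizers::Adagrad(Adagrad::new()),
        "adam" => Optimizers::Adam(Adam::new()),
        _ => Optimizers::SGD(SGD::new()),
    }
}
```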

Traits

Optimizer

Core trait implemented by all optimizers.
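
Training code can therefore be written once, generic over the concrete optimizer. A sketch assuming the trait's update entry point is a `step` method over the trainable parameters, as used in the crate's README training loops:

```rust
use wyrm::optim::Optimizer;
use wyrm::{ParameterNode, Variable};

// One update step, generic over SGD, Adagrad, Adam, or a synchronized
// wrapper. ASSUMPTION: `step` takes a slice of trainable parameters.
fn apply_step<O: Optimizer>(optimizer: &O, parameters: &[Variable<ParameterNode>]) {
    optimizer.step(parameters);
}
```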

Synchronizable

Trait implemented by synchronizable optimizers.