Module optim

§Optimization algorithms for training neural networks

Adam - Adaptive learning rate optimization algorithm for training neural networks

SGD - Stochastic Gradient Descent optimization algorithm for training neural networks
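The following is a minimal, self-contained sketch of the update rules behind these two algorithms, written against plain `f32` slices rather than this crate's actual types; the function names and signatures are illustrative only and do not reflect this module's API.

```rust
/// Illustrative SGD step: theta <- theta - lr * grad.
fn sgd_step(params: &mut [f32], grads: &[f32], lr: f32) {
    for (p, g) in params.iter_mut().zip(grads) {
        *p -= lr * *g;
    }
}

/// Illustrative Adam step with bias-corrected first and second moment
/// estimates. `m` and `v` persist across calls; `t` is the 1-based step count.
fn adam_step(
    params: &mut [f32],
    grads: &[f32],
    m: &mut [f32],
    v: &mut [f32],
    t: i32,
    lr: f32,
    beta1: f32,
    beta2: f32,
    eps: f32,
) {
    for i in 0..params.len() {
        // Exponential moving averages of the gradient and squared gradient.
        m[i] = beta1 * m[i] + (1.0 - beta1) * grads[i];
        v[i] = beta2 * v[i] + (1.0 - beta2) * grads[i] * grads[i];
        // Bias correction, then the per-parameter adaptive update.
        let m_hat = m[i] / (1.0 - beta1.powi(t));
        let v_hat = v[i] / (1.0 - beta2.powi(t));
        params[i] -= lr * m_hat / (v_hat.sqrt() + eps);
    }
}

fn main() {
    let mut params = vec![0.5_f32, -1.0];
    let grads = vec![0.2_f32, -0.4];

    // One SGD step with learning rate 0.1.
    sgd_step(&mut params, &grads, 0.1);

    // One Adam step with commonly used defaults (step t = 1).
    let (mut m, mut v) = (vec![0.0_f32; 2], vec![0.0_f32; 2]);
    adam_step(&mut params, &grads, &mut m, &mut v, 1, 0.001, 0.9, 0.999, 1e-8);

    println!("{params:?}");
}
```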

Structs§

Adam
Adaptive Moment Estimation (Adam) optimizer.
SGD
Stochastic Gradient Descent (SGD) optimizer.
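
To make the behaviour of such an optimizer concrete, here is a small self-contained example that minimizes the quadratic f(x) = (x - 3)^2 with a hand-rolled SGD update; it deliberately avoids this crate's structs, since their exact constructors and method names are not shown on this page.

```rust
fn main() {
    // Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
    let lr = 0.1_f32;
    let mut x = 0.0_f32;

    for step in 0..50 {
        let grad = 2.0 * (x - 3.0);
        x -= lr * grad; // plain SGD update
        if step % 10 == 0 {
            println!("step {step:2}: x = {x:.4}");
        }
    }

    // x converges toward the minimizer 3.0.
    assert!((x - 3.0).abs() < 1e-2);
}
```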

Traits§

Optimizer
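
This page does not list the Optimizer trait's methods; presumably it is the common interface implemented by Adam and SGD. As a rough mental model only (an assumption, not this crate's actual signature), such a trait usually abstracts over the per-step parameter update, along these lines:

```rust
/// Hypothetical shape of an optimizer abstraction; the real trait in this
/// module may differ in method names, generics, and tensor types.
trait Optimizer {
    /// Apply one update step to `params` given matching `grads`.
    fn step(&mut self, params: &mut [f32], grads: &[f32]);
}

/// Minimal SGD implementing the hypothetical trait above.
struct Sgd {
    lr: f32,
}

impl Optimizer for Sgd {
    fn step(&mut self, params: &mut [f32], grads: &[f32]) {
        for (p, g) in params.iter_mut().zip(grads) {
            *p -= self.lr * *g;
        }
    }
}

fn main() {
    let mut params = vec![1.0_f32, -2.0, 0.5];
    let grads = vec![0.1_f32, -0.2, 0.05];
    let mut opt = Sgd { lr: 0.01 };
    opt.step(&mut params, &grads);
    println!("{params:?}");
}
```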