Crate stochastic_optimizers


This crate provides implementations of common stochastic gradient optimization algorithms. They are designed to be lightweight, flexible, and easy to use.

Currently implemented:

- Adam
- AdaGrad
- SGD

The crate does not provide automatic differentiation; the gradient is supplied by the user.

§Examples

use stochastic_optimizers::{Adam, Optimizer};
// minimise the function (x-4)^2
let start = -3.0;
let mut optimizer = Adam::new(start, 0.1);

for _ in 0..10000 {
   let current_parameter = optimizer.parameters();

   // d/dx (x-4)^2
   let gradient = 2.0 * current_parameter - 8.0;
 
   optimizer.step(&gradient);
}

assert_eq!(optimizer.into_parameters(), 4.0);

The parameters are owned by the optimizer, and a reference to them can be obtained via parameters(). After optimization, ownership can be taken back via into_parameters(), which consumes the optimizer.
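A minimal sketch of that ownership flow, assuming (as the example above suggests) that parameters() returns a shared reference and into_parameters() takes the optimizer by value:

use stochastic_optimizers::{Adam, Optimizer};

let mut optimizer = Adam::new(-3.0_f64, 0.1);

// immutable borrow of the parameters while computing the gradient
let current = optimizer.parameters();
let gradient = 2.0 * current - 8.0;

// the borrow has ended, so the optimizer can be used mutably again
optimizer.step(&gradient);

// consumes the optimizer and hands back ownership of the parameters
let final_parameters: f64 = optimizer.into_parameters();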

§What types can be optimized

All types which implement the Parameters trait can be optimized. Implementations for the standard types f32, f64, Vec<T : Parameters> and [T : Parameters ; N] are provided.
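For instance, here is a minimal sketch of optimizing a Vec<f64> with the same API as the example above; the quadratic loss and its hand-derived gradient are chosen purely for illustration:

use stochastic_optimizers::{Adam, Optimizer};

// minimise f(x, y) = (x - 1)^2 + (y + 2)^2
let mut optimizer = Adam::new(vec![0.0_f64, 0.0], 0.1);

for _ in 0..10000 {
   let params = optimizer.parameters();

   // gradient: [2(x - 1), 2(y + 2)]
   let gradient = vec![2.0 * params[0] - 2.0, 2.0 * params[1] + 4.0];

   optimizer.step(&gradient);
}

// should be close to [1.0, -2.0]
let result = optimizer.into_parameters();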

It is relatively easy to implement it for custom types; see Parameters.

§ndarray

By enabling the ndarray feature you can use ndarray's Array as Parameters.
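A sketch of what that looks like, assuming the feature is enabled (e.g. features = ["ndarray"] in Cargo.toml) and that the gradient is passed as an Array of the same shape as the parameters:

use ndarray::Array1;
use stochastic_optimizers::{Adam, Optimizer};

// minimise f(x) = sum over i of (x_i - 4)^2
let start: Array1<f64> = Array1::zeros(5);
let mut optimizer = Adam::new(start, 0.1);

for _ in 0..10000 {
   // element-wise derivative: 2 * x_i - 8
   let gradient = optimizer.parameters().mapv(|x| 2.0 * x - 8.0);

   optimizer.step(&gradient);
}

// every element should be close to 4.0
let result = optimizer.into_parameters();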

§Structs

AdaGrad
Implements the AdaGrad algorithm
Adam
Implements the Adam algorithm
AdamBuilder
See Adam::builder
SGD
Implements stochastic gradient descent (SGD)

§Traits

Optimizer
Represents common functionality shared by all optimizers
Parameters
Allows a type to be used as parameters in an optimizer. The type should represent an owned collection of scalar variables, for example Vec<f64> or [f64; 10]