//! Collection of various optimization algorithms and strategies.
//!
//! # Building Blocks
//!
//! Each central primitive is specified by a trait:
//!
//! - **`Function`** - Specifies a function that can be minimized
//! - **`Function1`** - Extends a `Function` by its first derivative
//! - **`Summation`** - Represents a summation of functions, exploited, e.g., by SGD
//! - **`Summation1`** - Extends a `Summation` by its first derivative, analogous to `Function1`
//! - **`Minimizer`** - A minimization algorithm
//! - **`Evaluation`** - A function evaluation `f(x) = y` that is returned by a `Minimizer`
//! - **`Func`** - A newtype wrapper for the `Function` trait
//! - **`NumericalDifferentiation`** - Provides numerical differentiation for arbitrary `Function`s
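//!
//! As a quick sketch of these building blocks (assuming `Function` exposes
//! `value(&self, position: &[f64]) -> f64` and `Function1` adds
//! `gradient(&self, position: &[f64]) -> Vec<f64>`; check the trait definitions for the
//! exact signatures, and note the crate name `optimization` is assumed here), a custom
//! objective could be defined as follows:
//!
//! ```ignore
//! use optimization::{Function, Function1};
//!
//! struct Quadratic;
//!
//! impl Function for Quadratic {
//!     // f(x) = (x - 1)^2, minimal at x = 1 with f(1) = 0
//!     fn value(&self, position: &[f64]) -> f64 {
//!         (position[0] - 1.0).powi(2)
//!     }
//! }
//!
//! impl Function1 for Quadratic {
//!     // analytic first derivative: f'(x) = 2 (x - 1)
//!     fn gradient(&self, position: &[f64]) -> Vec<f64> {
//!         vec![2.0 * (position[0] - 1.0)]
//!     }
//! }
//! ```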
//!
//! # Algorithms
//!
//! Currently, the following algorithms are implemented. This list is not final and is being
//! expanded over time.
//!
//! - **`GradientDescent`** - Iterative gradient descent minimization, supporting various line
//! search methods:
//! - *`FixedStepWidth`* - No line search is performed, but a fixed step width is used
//! - *`ExactLineSearch`* - Exhaustive line search over a set of step widths
//! - *`ArmijoLineSearch`* - Backtracking line search using the Armijo rule as stopping
//! criterion
//! - **`StochasticGradientDescent`** - Iterative stochastic gradient descent minimization,
//!   currently using a fixed step width
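//!
//! As an illustrative sketch (the constructor and the `minimize` signature shown here are
//! assumptions and should be verified against the `Minimizer` trait), minimizing a
//! numerically differentiated function with `GradientDescent` might look like:
//!
//! ```ignore
//! use optimization::{Func, GradientDescent, Minimizer, NumericalDifferentiation};
//!
//! // f(x, y) = x^2 + y^2, differentiated numerically; the true minimum is at (0, 0)
//! let function = NumericalDifferentiation::new(Func(|x: &[f64]| {
//!     x[0] * x[0] + x[1] * x[1]
//! }));
//!
//! let minimizer = GradientDescent::new();
//! let solution = minimizer.minimize(&function, vec![10.0, -10.0]);
//!
//! println!("minimum found at {:?}", solution);
//! ```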
extern crate log;
extern crate rand;
extern crate rand_pcg;
// Re-export the public API from the crate's internal modules. The module
// paths below are assumed from the items documented above; adjust them to
// match the actual module layout of the crate.
pub use function::{Func, Function, Function1, Summation, Summation1};
pub use minimizer::{Evaluation, Minimizer};
pub use numeric::NumericalDifferentiation;
pub use gradient_descent::GradientDescent;
pub use stochastic_gradient_descent::StochasticGradientDescent;