Sample problems for testing optimization, along with their gradient functions
A set of classic objective functions and their gradients, along with uniform and Gaussian noise functions. Handy for testing and benchmarking.
See https://en.wikipedia.org/wiki/Test_functions_for_optimization
The sigmoid function can be used to bound the output range to (0, 1) while preserving the locations of minima.
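A minimal sketch of that idea, using stand-in implementations of `sphere` and `sigmoid` written from the formulas listed below (the crate's own functions are assumed to behave the same way): because sigmoid is monotonically increasing, composing it with an objective keeps the minimizer in the same place while squashing the output into (0, 1).

```rust
/// Stand-in for the crate's `sphere`: f(x) = x₁² + x₂² + … + xₙ²
fn sphere(x: &[f64]) -> f64 {
    x.iter().map(|xi| xi * xi).sum()
}

/// Stand-in for the crate's `sigmoid`: σ(x) = 1 / (1 + e^(−x))
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn main() {
    // The sphere minimum at the origin is still the minimum after
    // composing with sigmoid, because sigmoid is strictly increasing.
    let at_min = sigmoid(sphere(&[0.0, 0.0]));
    let away = sigmoid(sphere(&[1.0, -2.0]));
    assert!(at_min < away);

    // The composed output is bounded to (0, 1).
    assert!(at_min > 0.0 && away < 1.0);
    println!("σ(f(0)) = {at_min:.3}, σ(f(x)) = {away:.3}");
}
```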
Functions§
- abs - n-dimensional absolute value: f(x) = |x₁| + |x₂| + … + |xₙ|
- abs_grad - n-dim absolute value gradient: ∇f(x) = sgn(x) = [signum(x₁), signum(x₂), …, signum(xₙ)]
- gaussian_noise - Normal with mean 0 and standard deviation 1
- rosenbrock - f(x) = sum(i=0, i<N/2) [(x₂ᵢ − a)² + b(x₂ᵢ₊₁ − x₂ᵢ²)²]
- rosenbrock_grad - Rosenbrock gradient
- sigmoid - σ(x) = 1 / (1 + e^(−x))
- sigmoid_grad - Sigmoid function gradient: ∇σ(x) = σ′(x) = σ(x)(1 − σ(x))
- sphere - n-sphere centered at origin: f(x) = x₁² + x₂² + … + xₙ²
- sphere_grad - n-sphere gradient: ∇f(x) = [2x₁, 2x₂, …, 2xₙ]
- uniform_noise - Generates random noise in the f64 range [−1, +1]
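As a sketch of how these objectives and gradients fit together, the following stand-in implementations (written from the formulas above, not taken from the crate's source; the Rosenbrock constants a = 1, b = 100 are the conventional choice and an assumption here) pair `sphere` with `sphere_grad` in a few steps of plain gradient descent:

```rust
/// Stand-in sphere objective: f(x) = x₁² + x₂² + … + xₙ²
fn sphere(x: &[f64]) -> f64 {
    x.iter().map(|xi| xi * xi).sum()
}

/// Stand-in sphere gradient: ∇f(x) = [2x₁, 2x₂, …, 2xₙ]
fn sphere_grad(x: &[f64]) -> Vec<f64> {
    x.iter().map(|xi| 2.0 * xi).collect()
}

/// Stand-in pairwise Rosenbrock with assumed constants a = 1, b = 100.
fn rosenbrock(x: &[f64]) -> f64 {
    let (a, b) = (1.0, 100.0);
    x.chunks_exact(2)
        .map(|p| (p[0] - a).powi(2) + b * (p[1] - p[0] * p[0]).powi(2))
        .sum()
}

fn main() {
    // The pairwise Rosenbrock's global minimum is at (1, 1, …).
    assert_eq!(rosenbrock(&[1.0, 1.0]), 0.0);

    // Gradient descent on the sphere with a fixed step size converges
    // toward the origin: each step scales x by (1 − 2·0.1) = 0.8.
    let mut x = vec![3.0, -4.0];
    for _ in 0..100 {
        let g = sphere_grad(&x);
        for (xi, gi) in x.iter_mut().zip(&g) {
            *xi -= 0.1 * gi;
        }
    }
    assert!(sphere(&x) < 1e-6);
    println!("after descent: {x:?}");
}
```

The noise functions would slot in the same way: add `uniform_noise` or `gaussian_noise` to the objective's return value to test an optimizer's robustness to noisy evaluations.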