# Crate gomez


gomez is a framework and implementation for mathematical optimization and solving non-linear systems of equations.

The library is written completely in Rust. Its focus is on being useful for practical problems and having an API that is simple for easy cases as well as flexible for complicated ones. The name stands for global optimization & non-linear equations solving, with a few typos.

### Practical problems

The main goal is to be useful for practical problems. This is manifested by the following features:

• Derivative-free. No algorithm requires an analytical derivative (gradient, Hessian, Jacobian). Methods that use derivatives approximate them with the finite difference method¹.
• Constraints support. It is possible to specify the problem domain with constraints², which is necessary for many engineering applications.
• Non-naive implementations. The code is not a direct translation of textbook pseudocode. It is written with performance in mind and applies important techniques from numerical mathematics. It also tries to handle situations that hurt the methods but occur in practice.

¹ There is a plan to provide ways to override this approximation with a real derivative.

² Currently, only unconstrained and box-bounded domains are supported.
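For intuition about the finite difference approximation mentioned above, here is a minimal sketch of a forward-difference gradient in plain Rust. This is a generic textbook technique, not gomez's actual implementation (which lives in its derivatives tooling):

```rust
// Forward-difference approximation of the gradient of f at x:
// df/dx_i ≈ (f(x + h·e_i) - f(x)) / h, with h scaled to x_i's magnitude.
fn gradient_fd(f: impl Fn(&[f64]) -> f64, x: &[f64]) -> Vec<f64> {
    let fx = f(x);
    let eps = f64::EPSILON.sqrt(); // a common step-size heuristic
    let mut grad = Vec::with_capacity(x.len());
    let mut xh = x.to_vec();
    for i in 0..x.len() {
        let h = eps * x[i].abs().max(1.0);
        xh[i] = x[i] + h;
        grad.push((f(&xh) - fx) / h);
        xh[i] = x[i]; // restore the perturbed coordinate
    }
    grad
}

fn main() {
    // f(x, y) = x² + 3y, so the exact gradient is (2x, 3).
    let f = |x: &[f64]| x[0] * x[0] + 3.0 * x[1];
    let g = gradient_fd(f, &[2.0, -1.0]);
    // The approximation is close to [4.0, 3.0].
    assert!((g[0] - 4.0).abs() < 1e-6 && (g[1] - 3.0).abs() < 1e-6);
}
```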

### Algorithms

• Trust region – Recommended default method; it will just work in most cases.
• LIPO – Global optimization algorithm, useful for searching for good initial guesses in combination with a local numerical algorithm.
• Steffensen – Fast and lightweight method for solving one-dimensional systems of equations.
• Nelder-Mead – Direct search optimization method that does not use any derivatives.

This list will be extended in the future. But at the same time, having as many algorithms as possible is not the goal. Instead, the focus is on providing quality implementations of battle-tested methods.
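To give a feel for the simplest of these, the core iteration of Steffensen's method for a single scalar equation f(x) = 0 can be sketched in a few lines. This is the generic textbook iteration, not gomez's implementation:

```rust
// One Steffensen step: x ← x - f(x)² / (f(x + f(x)) - f(x)).
// Derivative-free, yet quadratically convergent near a simple root.
fn steffensen_solve(f: impl Fn(f64) -> f64, mut x: f64, tol: f64, max_iter: usize) -> f64 {
    for _ in 0..max_iter {
        let fx = f(x);
        if fx.abs() <= tol {
            break;
        }
        let denom = f(x + fx) - fx;
        if denom == 0.0 {
            break; // avoid division by zero
        }
        x -= fx * fx / denom;
    }
    x
}

fn main() {
    // Solve x² - 2 = 0 starting from 1.5; the root is √2.
    let root = steffensen_solve(|x| x * x - 2.0, 1.5, 1e-12, 50);
    assert!((root - 2.0_f64.sqrt()).abs() < 1e-9);
}
```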

### Mathematical optimization

Given a function f: D → R from some domain D (in Rⁿ) to the real numbers and an initial point x₀, the goal is to find a point x’ such that f(x’) is a minimum. Note that gomez does not guarantee that the minimum is global, although the focus is on global optimization techniques.

#### Example

```rust
use gomez::nalgebra as na;
use gomez::{Domain, Function, OptimizerDriver, Problem};
use na::{Dyn, IsContiguous};

// The objective function is represented by a struct.
struct Rosenbrock {
    a: f64,
    b: f64,
}

impl Problem for Rosenbrock {
    // Field type, f32 or f64.
    type Field = f64;

    // Domain of the function.
    fn domain(&self) -> Domain<Self::Field> {
        Domain::unconstrained(2)
    }
}

impl Function for Rosenbrock {
    // Body of the function, taking x and returning f(x).
    fn apply<Sx>(&self, x: &na::Vector<Self::Field, Dyn, Sx>) -> Self::Field
    where
        Sx: na::Storage<Self::Field, Dyn> + IsContiguous,
    {
        (self.a - x[0]).powi(2) + self.b * (x[1] - x[0].powi(2)).powi(2)
    }
}

let f = Rosenbrock { a: 1.0, b: 1.0 };
let mut optimizer = OptimizerDriver::builder(&f)
    .with_initial(vec![-10.0, -5.0])
    .build();

let (x, fx) = optimizer
    .find(|state| state.fx() <= 1e-6 || state.iter() >= 100)
    .expect("optimizer error");

println!("f(x) = {fx}\tx = {x:?}");
```

See `OptimizerDriver` and `OptimizerBuilder` for additional options.

### Systems of equations

Given a vector function r: D → Rⁿ, with components rᵢ: D → R from some domain D (in Rⁿ) to the real numbers for i = 1, 2, …, n, and an initial point x₀, the goal is to find a point x’ such that r(x’) = 0. Note that there is no constraint on the form of the equations rᵢ (compare with specialized solvers for systems of linear equations).

#### Example

```rust
use gomez::nalgebra as na;
use gomez::{Domain, Problem, SolverDriver, System};
use na::{Dyn, IsContiguous};

// The system of equations is represented by a struct.
struct Rosenbrock {
    a: f64,
    b: f64,
}

impl Problem for Rosenbrock {
    // Field type, f32 or f64.
    type Field = f64;

    // Domain of the system.
    fn domain(&self) -> Domain<Self::Field> {
        Domain::unconstrained(2)
    }
}

impl System for Rosenbrock {
    // Evaluation of the system (computing the residuals).
    fn eval<Sx, Srx>(
        &self,
        x: &na::Vector<Self::Field, Dyn, Sx>,
        rx: &mut na::Vector<Self::Field, Dyn, Srx>,
    ) where
        Sx: na::storage::Storage<Self::Field, Dyn> + IsContiguous,
        Srx: na::storage::StorageMut<Self::Field, Dyn>,
    {
        rx[0] = (self.a - x[0]).powi(2);
        rx[1] = self.b * (x[1] - x[0].powi(2)).powi(2);
    }
}

let r = Rosenbrock { a: 1.0, b: 1.0 };
let mut solver = SolverDriver::builder(&r)
    .with_initial(vec![-10.0, -5.0])
    .build();

let (x, norm) = solver
    .find(|state| state.norm() <= 1e-6 || state.iter() >= 100)
    .expect("solver error");

println!("|| r(x) || = {norm}\tx = {x:?}");
```

See `SolverDriver` and `SolverBuilder` for additional options.
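For intuition about how derivative-using solvers operate on such systems, here is a minimal Newton iteration on a small two-dimensional system with an analytic Jacobian. This is a generic illustration, not gomez's trust region implementation (which, per the features above, approximates the Jacobian with finite differences):

```rust
// Newton's method for r(x) = 0 in two dimensions:
// repeatedly solve J(x)·Δ = -r(x) and update x ← x + Δ.
// Example system: r1 = x² + y² - 4, r2 = x - y, with a root at (√2, √2).
fn newton_2d(mut x: f64, mut y: f64) -> (f64, f64) {
    for _ in 0..50 {
        let (r1, r2) = (x * x + y * y - 4.0, x - y);
        if r1.abs().max(r2.abs()) <= 1e-12 {
            break;
        }
        // Analytic Jacobian: [[2x, 2y], [1, -1]].
        let (j11, j12, j21, j22) = (2.0 * x, 2.0 * y, 1.0, -1.0);
        let det = j11 * j22 - j12 * j21;
        // Solve the 2x2 linear system J·Δ = -r by Cramer's rule.
        let dx = ((-r1) * j22 - (-r2) * j12) / det;
        let dy = (j11 * (-r2) - j21 * (-r1)) / det;
        x += dx;
        y += dy;
    }
    (x, y)
}

fn main() {
    let (x, y) = newton_2d(1.0, 2.0);
    assert!((x - 2.0_f64.sqrt()).abs() < 1e-9 && (y - 2.0_f64.sqrt()).abs() < 1e-9);
}
```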

### Custom algorithms

It is possible to create a custom algorithm by implementing the `Optimizer` and/or `Solver` trait. It can then be used by the driver like any other algorithm provided by gomez. See the documentation of the traits for more details.

```rust
let f = Rosenbrock { a: 1.0, b: 1.0 };
let mut optimizer = OptimizerDriver::builder(&f)
    .with_algo(|f, dom| MyAlgo::new(f, dom))
    .build();
```

If you implement an algorithm, please reach out to discuss if we could include it in gomez.

### Roadmap

The following is planned, not listed in priority order:

• Homotopy continuation method to compare the performance with the trust region method
• Experimentation with various global optimization techniques for initial guess search
• Evolutionary/nature-inspired algorithms
• Bayesian optimization
• Focus on initial guesses search and tools for analysis in general

## Re-exports

• `pub use driver::OptimizerDriver;`
• `pub use driver::SolverDriver;`
• `pub use nalgebra;`

## Modules

• `algo` – The collection of implemented algorithms.
• `analysis` – Various supporting analyses.
• `derivatives` – Tools for derivative-based methods.
• `driver` – High-level API for optimization and solving.