QOpt

A simple optimization package.

Optimization Paradigms

The latest version of QOpt supports the following paradigms.

  • Steepest Descent (Gradient Descent)
  • Newton's Method
  • Genetic Optimization
  • Simulated Annealing

Getting Started

Importing maria-linalg

You must add the latest version of the Rust crate maria-linalg as a dependency in order to use this package.
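
For example, you can add it (along with qopt itself) from the command line using Cargo, which pulls the latest published version of each crate:

$ cargo add maria-linalg
$ cargo add qopt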

Creating a Function

First, define a struct that implements the Function trait. This represents a function that accepts an N-dimensional vector and returns a scalar.

Function has three methods.

  • Function::objective (required). Returns the objective value (f64).
  • Function::gradient (optional). Returns the gradient (Vector<N>).
  • Function::hessian (optional). Returns the Hessian (Matrix<N>).

See the example below. Note that you must also import maria_linalg::Vector and, only if you implement Function::hessian, maria_linalg::Matrix.

use qopt::Function;
use maria_linalg::{Matrix, Vector};

/// Number of dimensions of input vector.
const N: usize = 6;

pub struct MyFunction { }

impl MyFunction {
    pub fn new() -> Self {
        Self { }
    }
}

impl Function<N> for MyFunction {
    fn objective(&self, input: Vector<N>) -> f64 {
        // Required: evaluate the objective at `input`.
        todo!()
    }

    fn gradient(&self, input: Vector<N>) -> Vector<N> {
        // Optional: evaluate the gradient at `input`.
        todo!()
    }

    fn hessian(&self, input: Vector<N>) -> Matrix<N> {
        // Optional: evaluate the Hessian at `input`.
        todo!()
    }
}
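
Because Function::gradient and Function::hessian are optional, a minimal implementation may provide only Function::objective. The sketch below assumes that the optional methods fall back to default implementations when they are omitted.

impl Function<N> for MyFunction {
    fn objective(&self, input: Vector<N>) -> f64 {
        // Only the required method is implemented here; the optional
        // gradient and Hessian are assumed to use their defaults.
        todo!()
    }
}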

Creating an Optimizer

Once you have a struct that implements Function, you can create an Optimizer.

use qopt::Optimizer;
use maria_linalg::Vector;

/// Number of individuals per optimization iteration.
/// 
/// For deterministic methods (gradient descent or Newton's method), this should be 1.
/// For stochastic methods (genetic optimization or simulated annealing), this should be about 100.
const POPULATION: usize = 100;

fn main() {
    let f = MyFunction::new();
    let optimizer: Optimizer<N, POPULATION> = Optimizer::new(Box::new(f));

    // An initial guess
    let input = Vector::zero();
    let output = optimizer.optimize(input);

    println!("{}", output);
}

Running the Optimizer

You are now ready to run the optimizer using command-line arguments.

The structure for a command to execute the optimizer is as follows.

$ cargo run --release --quiet -- [--argument value]

Alternatively, if you have built a binary, you may run it with the same argument structure. Suppose the binary is named myoptimizer.

$ myoptimizer [--argument value]
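
For example, an invocation that selects Newton's method and caps the number of iterations (using only the flags documented below; the values are illustrative) looks like this:

$ myoptimizer --paradigm newton --maxiter 50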

Command-Line Arguments

The following are permitted command-line arguments and values. Note that all arguments are optional.

--paradigm [string]

Optimization paradigm.

Defaults to steepest-descent.

Accepts the following options.

  • steepest-descent. Steepest (gradient) descent. It is recommended (but not required) to implement Function::gradient for this.
  • newton. Newton's method. It is recommended (but not required) to implement Function::gradient and Function::hessian for this.
  • genetic. Genetic algorithm.
  • simulated-annealing. Simulated annealing.

--criterion [float]

Gradient-based convergence criterion. When the norm of the gradient falls below this value, the optimizer halts. Note that this criterion assumes the function is locally convex.

Defaults to 0.001.

Accepts a floating-point number.

--maxiter [integer]

Maximum number of optimization iterations.

Defaults to 100.

Accepts an integer.

--maxtemp [float]

Maximum temperature. This is only used for the simulated annealing paradigm.

Defaults to 1.0.

Accepts a floating-point number.

--stdev [float]

Standard deviation of mutations. This is only used for stochastic methods (genetic optimization and simulated annealing).

Defaults to 1.0.

Accepts a floating-point number.
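
Putting it together, a run that uses simulated annealing with a higher maximum temperature, a broader mutation standard deviation, a tighter convergence criterion, and more iterations might look like this (only the flags documented above are used; the values are illustrative):

$ cargo run --release --quiet -- --paradigm simulated-annealing --maxtemp 10.0 --stdev 2.0 --criterion 0.0001 --maxiter 1000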