# Crate metaheuristics_nature


## metaheuristics-nature

A collection of nature-inspired metaheuristic algorithms. This crate provides an objective function trait, well-known methods, and utility functions for implementing your own search method.

This crate implements the following algorithms:

- Real-coded Genetic Algorithm (RGA)
- Differential Evolution (DE)
- Particle Swarm Optimization (PSO)
- Firefly Algorithm (FA)
- Teaching-Learning Based Optimization (TLBO)

Side functions:

- Parallelizable seeded random number generator (RNG)
  - The RNG is reproducible in both single-threaded and multi-threaded programs.
- Pareto front for multi-objective optimization (MOO)
  - You can return multiple fitness values from the objective function.
  - The history-best solutions across all fitness values are kept as a set (the Pareto front).
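The Pareto-front idea can be sketched in plain Rust (a toy illustration only; this crate's actual `Fitness` types work differently): a candidate dominates another if it is no worse in every objective and strictly better in at least one, and the front keeps the non-dominated candidates.

```rust
// Toy Pareto-dominance check over raw fitness vectors (illustrative only).
fn dominates(a: &[f64], b: &[f64]) -> bool {
    // `a` dominates `b`: no worse everywhere, strictly better somewhere.
    a.iter().zip(b).all(|(x, y)| x <= y) && a.iter().zip(b).any(|(x, y)| x < y)
}

fn pareto_front(points: &[Vec<f64>]) -> Vec<Vec<f64>> {
    // Keep every point that no other point dominates.
    points
        .iter()
        .filter(|p| !points.iter().any(|q| dominates(q, p)))
        .cloned()
        .collect()
}

fn main() {
    let pts = vec![vec![1.0, 4.0], vec![2.0, 2.0], vec![3.0, 3.0]];
    let front = pareto_front(&pts);
    // [3, 3] is dominated by [2, 2]; the other two trade off against each other.
    assert_eq!(front, vec![vec![1.0, 4.0], vec![2.0, 2.0]]);
    println!("front size = {}", front.len());
}
```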

Each algorithm exposes the same API and default parameters to help you compare different implementations. For example, you can try another algorithm by replacing `Rga` with `De`.

```rust
use metaheuristics_nature as mh;
let mut report = Vec::with_capacity(20);
// Build and run the solver
let s = mh::Solver::build(mh::Rga::default(), mh::tests::TestObj)
.seed(0)
.task(|ctx| ctx.gen == 20)
.callback(|ctx| report.push(ctx.best.get_eval()))
.solve();
// Get the optimized XY value of your function
let (xs, p) = s.as_best();
// If `p` is a `WithProduct` type wrapped with the fitness value
let err = p.ys();
let result = p.as_result();
// Get the history reports
let y2 = &report[2];
```

#### What kinds of problems can be solved?

If your problem can be simulated and evaluated, the optimization method efficiently finds the best design! 🚀

Assuming your simulation can be done with a function `f` that takes the parameters `X` as input and produces the evaluation value `y`, the optimization method will try to adjust `X = {x0, x1, ...}` to obtain the smallest `y`. Their relationship can be written as `f(X) = y`.
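As a concrete example, here is the sphere function, a common test objective (the function name `f` follows the notation above and is not part of this crate's API):

```rust
// Sphere function: f(X) = x0^2 + x1^2 + ... ; the minimum is y = 0 at X = (0, 0, ...).
fn f(xs: &[f64]) -> f64 {
    xs.iter().map(|x| x * x).sum()
}

fn main() {
    assert_eq!(f(&[0.0, 0.0]), 0.0); // global minimum
    assert_eq!(f(&[3.0, 4.0]), 25.0); // 9 + 16
    println!("f(3, 4) = {}", f(&[3.0, 4.0]));
}
```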

The number of parameters in `X` is called the “dimension”. Imagine `X` as a coordinate in a multi-dimensional space and `y` as the weight of that “point.” As the dimension increases, the problem becomes more challenging to search.

The “metaheuristic” algorithms use multiple points to search for the minimum value; they estimate the local trend across the most feasible solutions and keep away from local optima, even when the gradient or the feasible region is unknown.
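The multiple-points idea can be sketched with a minimal population-based random search (a generic illustration, not one of this crate's algorithms; the tiny LCG keeps the sketch dependency-free and deterministic):

```rust
// Objective to minimize: the sphere function, minimal at (0, 0).
fn f(xs: &[f64; 2]) -> f64 {
    xs.iter().map(|x| x * x).sum()
}

fn main() {
    // Tiny deterministic LCG standing in for a real RNG.
    let mut state: u64 = 12345;
    let mut rand01 = move || {
        state = state
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        (state >> 33) as f64 / (1u64 << 31) as f64 // in [0, 1)
    };

    // A pool of candidate points scattered over [-5, 5]^2.
    let mut pool: Vec<[f64; 2]> = (0..8)
        .map(|_| [rand01() * 10.0 - 5.0, rand01() * 10.0 - 5.0])
        .collect();
    let mut best = pool[0];

    for _generation in 0..200 {
        for p in pool.iter_mut() {
            // Perturb each point and greedily keep improvements —
            // no gradient information is required.
            let cand = [p[0] + rand01() - 0.5, p[1] + rand01() - 0.5];
            if f(&cand) < f(p) {
                *p = cand;
            }
            if f(p) < f(&best) {
                best = *p;
            }
        }
    }
    assert!(f(&best) < 1.0); // the pool closes in on the minimum at (0, 0)
    println!("best fitness = {:.4}", f(&best));
}
```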

Please have a look at the API documentation for more information.

#### Gradient-based Methods

For more straightforward functions, for example when the first derivative is known, gradient-based methods such as OSQP are recommended for the fastest speed.

## Terminologies

To unify the terms in this documentation:

- “Iteration” is called “generation” (to avoid confusion with iterators).
- The “function” that evaluates the design is called the “objective function”.
- The “return value” of the objective function is called the “fitness”.

## Algorithms

There are two traits, `Algorithm` and `AlgCfg`. The former is used to design the optimization method, and the latter is the settings interface.

`Solver` is a simple interface for obtaining the solution and analyzing the result. This type allows you to use the pre-defined methods without importing any traits.

All provided methods are listed in the module `methods`.

To implement your own method, please see `prelude`.

## Objective Function

For a quick demo with a callable object, please see `Fx`.

You can define your problem as an objective function by implementing `ObjFunc`; the upper bound, the lower bound, and the objective function `ObjFunc::fitness()`, which returns a `Fitness` type, should all be defined.

## Random Function

This crate uses a 64-bit ChaCha algorithm (`random::Rng`) to generate uniform random values. A random seed is required beforehand; it is generated by the `getrandom` crate, so please see that crate's supported platforms.
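The reproducibility property can be illustrated with a toy seeded generator (a stand-in LCG; the crate itself uses ChaCha via `random::Rng`): two generators built from the same seed always produce the same stream, which is what makes a seeded search repeatable across runs.

```rust
// A toy seeded generator: the same seed always yields the same sequence.
struct TinyRng(u64);

impl TinyRng {
    fn new(seed: u64) -> Self {
        Self(seed)
    }
    fn next_u64(&mut self) -> u64 {
        // Linear congruential step (illustrative, not cryptographic).
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        self.0
    }
}

fn main() {
    let mut a = TinyRng::new(0);
    let mut b = TinyRng::new(0);
    // Identical seeds reproduce identical streams.
    for _ in 0..5 {
        assert_eq!(a.next_u64(), b.next_u64());
    }
    println!("streams match");
}
```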

## Features

The crate features:

- `std`: Default feature. Enables standard library functions, such as timing and threading. If `std` is disabled, the “libm” crate will be enabled for the math functions.
- `rayon`: Enables parallel computation via `rayon`. Disable it on platforms that don't support threading, or if your objective function is not complicated enough to benefit. This feature requires `std`.
- `clap`: Adds CLI argument support for the provided algorithms and their options.
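As a sketch of how these features might be selected in a downstream `Cargo.toml` (the version string is a placeholder, not a recommendation):

```toml
[dependencies]
# no_std build: disable the default `std` feature (math falls back to "libm")
metaheuristics-nature = { version = "*", default-features = false }

# or, keep the defaults and add parallelism (`rayon` requires `std`):
# metaheuristics-nature = { version = "*", features = ["rayon"] }
```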

## Compatibility

If you use this crate to provide an objective function, your downstream crates may run into compatibility problems.

The most important thing is to use a stable version by specifying the major version number, and then re-export (`pub use`) this crate for the downstream crates.

This crate does the same for `rand` and `rayon`.

## Modules

- Pre-implemented optimization methods.
- Single/Multi-objective best containers.
- A prelude module for algorithm implementation.
- Random number generator module.

## Macros

- A tool macro used to generate multiple builder functions (methods).

## Structs

- A basic context type of the algorithms.
- A quick interface to help create an objective function from a callable object.
- A `Fitness` type carrying a multi-objective `Fitness` value. It becomes a single-objective task via `Fitness::eval()`.
- A public API for using the optimization methods.
- Collects the configuration and builds the solver.
- A `Fitness` type carrying the final results.

## Enums

- Initial pool generating options.

## Traits

- Algorithm configurations; a trait for preparing the algorithm.
- The methods of the metaheuristic algorithms.
- A problem that is well bounded.
- A trait for dominance comparison.
- `MaybeParallel` (tied to the `rayon` feature): a marker trait for parallel computation.
- A trait for the objective function.

## Functions

- A function that generates a Gaussian pool.
- A function that generates a uniform pool.

## Type Aliases

- A `SolverBuilder` that uses a boxed algorithm.