Dense nonlinear optimization solvers in Rust.
This crate provides:
- `Problem` + `optimize`: the default API for solver selection that is hard to misuse.
- `SecondOrderProblem` + `optimize`: automatic selection for Hessian-aware objectives.
- `Bfgs`: dense quasi-Newton optimization with a robust hybrid line search.
- `NewtonTrustRegion`: Hessian-based trust-region optimization.
- `Arc`: Adaptive Regularization with Cubics (ARC).
All solvers support optional simple box constraints and are built around practical robustness for noisy/non-ideal objectives.
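The box-constraint handling can be sketched as coordinate clamping plus a projected gradient. This is a minimal crate-independent illustration; `clamp_to_box` and `projected_grad` are hypothetical helpers, not part of this crate's API:

```rust
// Sketch of box-constraint handling: clamp coordinates into [lo, hi] and
// zero out gradient components that point outward at an active bound, so
// the projected gradient norm can serve as a stationarity measure.
// Illustrative helpers only; not this crate's API.

fn clamp_to_box(x: &mut [f64], lo: &[f64], hi: &[f64]) {
    for i in 0..x.len() {
        x[i] = x[i].clamp(lo[i], hi[i]);
    }
}

fn projected_grad(x: &[f64], g: &[f64], lo: &[f64], hi: &[f64]) -> Vec<f64> {
    (0..x.len())
        .map(|i| {
            // A positive gradient at the lower bound (or negative at the
            // upper bound) would push the iterate outside the box.
            if (x[i] <= lo[i] && g[i] > 0.0) || (x[i] >= hi[i] && g[i] < 0.0) {
                0.0
            } else {
                g[i]
            }
        })
        .collect()
}

fn main() {
    let mut x = vec![-2.0, 0.5];
    let (lo, hi) = (vec![0.0, 0.0], vec![1.0, 1.0]);
    clamp_to_box(&mut x, &lo, &hi);
    assert_eq!(x, vec![0.0, 0.5]);
    // The component pointing below the active lower bound is projected out.
    let g = projected_grad(&x, &[1.0, -3.0], &lo, &hi);
    assert_eq!(g, vec![0.0, -3.0]);
}
```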
§Features
- `Bfgs` hybrid line search: Strong Wolfe with nonmonotone (GLL) Armijo, approximate-Wolfe, and gradient-reduction acceptors, plus a best-seen salvage path and a small probing grid.
- `Bfgs` trust-region (dogleg) fallback with CG-based solves on the inverse Hessian, diagonal regularization, and scaled-identity resets under severe noise.
- `NewtonTrustRegion`: projected Steihaug-Toint trust-region iterations using objective Hessians.
- `Arc`: cubic-regularized model steps with adaptive regularization updates (rho, sigma).
- Profile-based heuristic policy selection for rough, piecewise-flat objectives.
- Adaptive strategy switching (Wolfe <-> Backtracking) based on success streaks (no timed flips).
- Optional box constraints with projected gradients and coordinate clamping.
- Optional flat-bracket midpoint acceptance inside zoom.
- Stochastic jiggling of step sizes on persistent flats.
- Multi-direction (coordinate) rescue when progress is flat.
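The dogleg fallback mentioned above can be sketched as follows. This is a simplified, crate-independent illustration over plain slices; the crate's actual trust-region path (CG solves, regularization, resets) is not shown:

```rust
// Dogleg step sketch: take the full (quasi-)Newton step if it fits inside
// the trust radius; otherwise walk from the Cauchy (steepest-descent) point
// toward the Newton point and stop at the trust-region boundary.
// Illustrative only; not this crate's internal implementation.

fn norm(v: &[f64]) -> f64 {
    v.iter().map(|x| x * x).sum::<f64>().sqrt()
}

/// p_cauchy: Cauchy point along -g; p_newton: full step; delta: trust radius.
fn dogleg(p_cauchy: &[f64], p_newton: &[f64], delta: f64) -> Vec<f64> {
    if norm(p_newton) <= delta {
        return p_newton.to_vec(); // full step fits inside the region
    }
    if norm(p_cauchy) >= delta {
        // Even the Cauchy point leaves the region: scale it to the boundary.
        let s = delta / norm(p_cauchy);
        return p_cauchy.iter().map(|x| s * x).collect();
    }
    // Solve ||p_c + tau * (p_n - p_c)||^2 = delta^2 for tau in [0, 1].
    let d: Vec<f64> = p_newton.iter().zip(p_cauchy).map(|(n, c)| n - c).collect();
    let a: f64 = d.iter().map(|x| x * x).sum();
    let b: f64 = 2.0 * p_cauchy.iter().zip(&d).map(|(c, di)| c * di).sum::<f64>();
    let c: f64 = p_cauchy.iter().map(|x| x * x).sum::<f64>() - delta * delta;
    let tau = (-b + (b * b - 4.0 * a * c).sqrt()) / (2.0 * a);
    p_cauchy.iter().zip(&d).map(|(ci, di)| ci + tau * di).collect()
}

fn main() {
    // Full Newton step fits: take it unchanged.
    assert_eq!(dogleg(&[0.5, 0.0], &[0.8, 0.0], 1.0), vec![0.8, 0.0]);
    // Otherwise the dogleg path is cut at the boundary ||p|| = delta.
    let p = dogleg(&[0.5, 0.0], &[2.0, 0.0], 1.0);
    assert!((norm(&p) - 1.0).abs() < 1e-9);
}
```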
§Defaults (key settings)
- Line search: Strong Wolfe primary; GLL nonmonotone Armijo; approximate-Wolfe and gradient-drop acceptors; probing grid; keep-best salvage.
- Trust region: dogleg fallback enabled; Δ₀ = min(1, 10/||g₀||); adaptive by ρ; SPD enforcement and scaled-identity resets when needed.
- Tolerances: c1 = 1e-4, c2 = 0.9; heuristics selected by `Profile`.
- Zoom midpoint: flat-bracket midpoint acceptance under profile control.
- Stochastic jiggling: default ON with scale 1e-3 (only after repeated flats in backtracking).
- Coordinate rescue: default ON (only after two consecutive flat accepts).
- Strategy switching: switch Wolfe <-> Backtracking only on success/failure streaks (no timed flips).
- Clear, configurable builder API and robust termination with informative errors.
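With the default constants c1 = 1e-4 and c2 = 0.9, the strong Wolfe acceptance test can be sketched as follows. This is an illustration of the condition itself, not the crate's internal line-search code:

```rust
// Strong Wolfe acceptance sketch with the default constants above.
// A trial step t along a descent direction is accepted when it gives
// sufficient decrease (Armijo) and a sufficiently flattened directional
// derivative (curvature). Illustrative only.

const C1: f64 = 1e-4;
const C2: f64 = 0.9;

/// f0, g0: value and directional derivative phi'(0) at the current point;
/// ft, gt: the same quantities at trial step t. Assumes g0 < 0 (descent).
fn strong_wolfe(f0: f64, g0: f64, t: f64, ft: f64, gt: f64) -> bool {
    let armijo = ft <= f0 + C1 * t * g0;
    let curvature = gt.abs() <= C2 * g0.abs();
    armijo && curvature
}

fn main() {
    // phi(t) = (t - 1)^2 with phi'(t) = 2(t - 1): a descent line from t = 0.
    let phi = |t: f64| (t - 1.0).powi(2);
    let dphi = |t: f64| 2.0 * (t - 1.0);
    // The exact minimizer t = 1 satisfies both conditions.
    assert!(strong_wolfe(phi(0.0), dphi(0.0), 1.0, phi(1.0), dphi(1.0)));
    // A tiny step fails the curvature condition (slope has barely changed).
    assert!(!strong_wolfe(phi(0.0), dphi(0.0), 1e-6, phi(1e-6), dphi(1e-6)));
}
```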
§Example
Minimize the Rosenbrock function, a classic test case for optimization algorithms.
```rust
use opt::{
    optimize, FirstOrderObjective, FirstOrderSample, MaxIterations, Problem, Profile, Solution,
    Tolerance,
};
use ndarray::{array, Array1};

struct Rosenbrock;

impl opt::ZerothOrderObjective for Rosenbrock {
    fn eval_cost(&mut self, x: &Array1<f64>) -> Result<f64, opt::ObjectiveEvalError> {
        let a = 1.0;
        let b = 100.0;
        Ok((a - x[0]).powi(2) + b * (x[1] - x[0].powi(2)).powi(2))
    }
}

impl FirstOrderObjective for Rosenbrock {
    fn eval_grad(&mut self, x: &Array1<f64>) -> Result<FirstOrderSample, opt::ObjectiveEvalError> {
        let a = 1.0;
        let b = 100.0;
        let f = (a - x[0]).powi(2) + b * (x[1] - x[0].powi(2)).powi(2);
        let gradient = array![
            -2.0 * (a - x[0]) - 4.0 * b * (x[1] - x[0].powi(2)) * x[0],
            2.0 * b * (x[1] - x[0].powi(2)),
        ];
        Ok(FirstOrderSample { value: f, gradient })
    }
}

// Set the initial guess.
let x0 = array![-1.2, 1.0];

// Run the solver.
let Solution {
    final_point: x_min,
    final_value,
    iterations,
    ..
} = optimize(Problem::new(x0, Rosenbrock))
    .with_tolerance(Tolerance::new(1e-6).unwrap())
    .with_max_iterations(MaxIterations::new(100).unwrap())
    .with_profile(Profile::Robust)
    .run()
    .expect("BFGS failed to solve");

println!(
    "Found minimum f([{:.3}, {:.3}]) = {:.4} in {} iterations.",
    x_min[0], x_min[1], final_value, iterations
);

// The known minimum is at [1.0, 1.0].
assert!((x_min[0] - 1.0).abs() < 1e-5);
assert!((x_min[1] - 1.0).abs() < 1e-5);
```
§Structs
- `Arc`: A configurable Adaptive Regularization with Cubics (ARC) solver.
- `Bfgs`: A configurable BFGS solver.
- `Bounds`
- `FiniteDiffGradient`
- `FirstOrderSample`
- `FixedPoint`
- `FixedPointSample`
- `MaxIterations`
- `NewtonTrustRegion`
- `Problem`
- `SecondOrderProblem`
- `SecondOrderSample`
- `Solution`: A summary of a successful solver run.
- `SymmetricHessianMut`
- `Tolerance`
§Enums
- `ArcError`
- `AutoSecondOrderError`
- `AutoSecondOrderSolver`
- `BfgsError`: An error type for clear diagnostics.
- `BoundsError`
- `ConfigError`
- `FixedPointError`
- `FixedPointStatus`
- `LineSearchFailureReason`
- `MatrixError`
- `NewtonTrustRegionError`
- `ObjectiveEvalError`
- `Profile`
- `StationarityKind`