# opt

Dense nonlinear optimization in Rust with:

- `Bfgs` for first-order dense quasi-Newton optimization
- `NewtonTrustRegion` for Hessian-based trust-region optimization
- `Arc` for adaptive regularization with cubics (ARC)
- `FixedPoint` for bounded fixed-point iteration
- automatic solver selection through `Problem`, `SecondOrderProblem`, and `optimize`
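As a standalone illustration of the last of these building blocks, bounded fixed-point iteration with projection and step-norm termination can be sketched in plain Rust (this is illustrative, not the crate's `FixedPoint` implementation):

```rust
// Sketch of bounded fixed-point iteration: apply the map, project the
// iterate back into [lo, hi], and stop once the step norm falls below a
// tolerance. Illustrative only; not this crate's internals.
fn fixed_point(g: impl Fn(f64) -> f64, mut x: f64, lo: f64, hi: f64, tol: f64, max_iters: usize) -> f64 {
    for _ in 0..max_iters {
        let next = g(x).clamp(lo, hi); // projection onto the bounds
        if (next - x).abs() < tol {    // step-norm termination
            return next;
        }
        x = next;
    }
    x
}

fn main() {
    // Solve x = cos(x) on [0, 1]; the fixed point is approximately 0.739085.
    let root = fixed_point(|x| x.cos(), 1.0, 0.0, 1.0, 1e-12, 10_000);
    println!("{root:.6}"); // 0.739085
}
```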
This crate targets practical nonlinear objectives, including optional simple box constraints, and emphasizes robustness on noisy or otherwise non-ideal functions.
This work is a rewrite of the original `bfgs` crate by Paul Kernfeld.
## Features
- Strong Wolfe line search with practical fallback behavior for difficult first-order problems
- Dense BFGS with inverse-Hessian updates and stability safeguards
- Newton trust-region steps using supplied Hessians
- ARC with adaptive cubic regularization updates
- Fixed-point iteration with projection and step-norm termination
- Automatic solver selection for second-order objectives
- Optional simple box constraints with projected gradients
- Internal finite-difference support for cost-only and Hessian-optional objectives
- Structured error reporting with recoverable optimization failures
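To illustrate the box-constraint feature in general terms, a projected gradient step takes an ordinary gradient step and then clamps each coordinate back into its bounds. The sketch below is standalone plain Rust, not this crate's internal implementation:

```rust
// Sketch of one projected gradient step for box constraints: move against
// the gradient, then clamp each coordinate into [lo_i, hi_i].
// Illustrative only; not this crate's internals.
fn projected_gradient_step(x: &[f64], grad: &[f64], step: f64, lo: &[f64], hi: &[f64]) -> Vec<f64> {
    x.iter()
        .zip(grad)
        .zip(lo.iter().zip(hi))
        .map(|((&xi, &gi), (&l, &h))| (xi - step * gi).clamp(l, h))
        .collect()
}

fn main() {
    // Minimize f(x) = x^2 subject to x in [1, 2]; the constrained minimum is x = 1.
    let mut x = vec![2.0];
    for _ in 0..100 {
        let g = vec![2.0 * x[0]];
        x = projected_gradient_step(&x, &g, 0.1, &[1.0], &[2.0]);
    }
    println!("{:.3}", x[0]); // settles on the lower bound: 1.000
}
```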
## Usage

Add this to your `Cargo.toml`:

```toml
[dependencies]
opt = "0.2.0"
```
## Example: First-Order Optimization

The snippet below shows the intended call shape. The objective type and the concrete argument values are illustrative placeholders; substitute your own `Problem` implementation and settings.

```rust
use opt::{optimize, Problem, Solution};
use ndarray::array;

// `Rosenbrock` stands in for any type implementing `Problem`.
let problem = Rosenbrock::default();

let x0 = array![-1.2, 1.0];
let Solution { point, value, .. } = optimize(problem)
    .with_tolerance(1e-8)
    .with_max_iterations(500)
    .with_profile(Default::default())
    .run(x0)
    .expect("optimization should converge");

assert!(value.is_finite());
assert!(point.iter().all(|v| v.is_finite()));
```
For cost-only objectives, wrap a `ZerothOrderObjective` with `FiniteDiffGradient`.
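As a rough illustration of the finite-difference idea (standalone code, not `FiniteDiffGradient` itself), a central-difference gradient perturbs each coordinate in turn:

```rust
// Sketch of a central finite-difference gradient: for each coordinate,
// evaluate the cost at x + h*e_i and x - h*e_i and divide the difference
// by 2h. Illustrative only; not this crate's internals.
fn fd_gradient<F: Fn(&[f64]) -> f64>(f: F, x: &[f64], h: f64) -> Vec<f64> {
    (0..x.len())
        .map(|i| {
            let mut xp = x.to_vec();
            let mut xm = x.to_vec();
            xp[i] += h;
            xm[i] -= h;
            (f(&xp) - f(&xm)) / (2.0 * h)
        })
        .collect()
}

fn main() {
    // f(x, y) = x^2 + 3y; the exact gradient at (1, 2) is (2, 3).
    let f = |x: &[f64]| x[0] * x[0] + 3.0 * x[1];
    let g = fd_gradient(f, &[1.0, 2.0], 1e-6);
    println!("{:.3} {:.3}", g[0], g[1]); // 2.000 3.000
}
```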
## Example: Second-Order Optimization

Use `SecondOrderProblem` with `optimize` for automatic solver selection, or construct `NewtonTrustRegion` and `Arc` directly when you want explicit control over the algorithm choice.
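To illustrate the core idea that `NewtonTrustRegion` builds on, here is a standalone 1D Newton iteration with no trust-region safeguard (illustrative only; the real solver caps the step and works in n dimensions):

```rust
// Sketch of undamped Newton iteration in 1D: step by -grad/hess each round.
// A trust region would additionally bound the step length.
fn newton_1d(grad: impl Fn(f64) -> f64, hess: impl Fn(f64) -> f64, mut x: f64, iters: usize) -> f64 {
    for _ in 0..iters {
        x -= grad(x) / hess(x);
    }
    x
}

fn main() {
    // Minimize f(x) = (x - 3)^2: gradient 2(x - 3), Hessian 2.
    let x_min = newton_1d(|x| 2.0 * (x - 3.0), |_| 2.0, 0.0, 5);
    println!("{x_min}"); // 3 (exact after one step on a quadratic)
}
```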
## Testing

Run the crate tests from the repository root:

```sh
cargo test -p opt
```

The crate also includes comparison tests against SciPy, driven by `opt/optimization_harness.py`.
## License
Licensed under either of:
- Apache License, Version 2.0
- MIT license
at your option.