# SciRS2 Optimization Module

`scirs2-optimize` is a comprehensive optimization library providing algorithms for unconstrained and constrained optimization, least-squares problems, and root finding. It aims to provide a Rust implementation of SciPy's optimization functionality with a similar API.
## Features
The module is divided into several key components:
### Unconstrained Optimization
Algorithms for minimizing scalar functions of one or more variables without constraints:
- Nelder-Mead simplex algorithm
- BFGS (Broyden-Fletcher-Goldfarb-Shanno) algorithm
- Powell's method
- Conjugate Gradient method
### Constrained Optimization
Algorithms for minimizing scalar functions with constraints:
- SLSQP (Sequential Least Squares Programming)
- Trust Region Constrained algorithm
### Least Squares Optimization
Algorithms for solving nonlinear least squares problems:
- Levenberg-Marquardt algorithm
- Trust Region Reflective algorithm
### Root Finding
Algorithms for finding roots of nonlinear functions:
- Hybrid method (modified Powell algorithm)
- Broyden's method (Good and Bad variants)
- Anderson acceleration
- Krylov subspace methods (GMRES)
## Installation

Add the following to your `Cargo.toml`:

```toml
[dependencies]
scirs2-optimize = "0.1.0"
```
## Usage Examples
### Unconstrained Optimization
Below is a minimal sketch that minimizes the Rosenbrock function. The `scirs2_optimize::unconstrained` path and the `minimize`/`Method` names are assumptions modeled on the crate's SciPy-like design; consult the crate documentation for the exact signatures.
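```rust
use ndarray::array;
use scirs2_optimize::unconstrained::{minimize, Method}; // assumed module path

// Define a function to minimize (e.g., the Rosenbrock function)
fn rosenbrock(x: &[f64]) -> f64 {
    (1.0 - x[0]).powi(2) + 100.0 * (x[1] - x[0].powi(2)).powi(2)
}

fn main() {
    // Initial guess
    let x0 = array![0.0, 0.0];

    // Hypothetical call mirroring SciPy's `minimize(fun, x0, method="BFGS")`;
    // the actual signature in scirs2-optimize may differ.
    let result = minimize(rosenbrock, &x0, Method::BFGS, None)
        .expect("optimization failed");

    println!("solution: {:?}", result.x);
    println!("objective value: {}", result.fun);
}
```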
### Constrained Optimization
The sketch below minimizes a smooth objective subject to the inequality constraint x[0] + x[1] <= 3 using SLSQP. The `minimize_constrained` and `Constraint` names are assumptions in the same SciPy-like spirit as above.
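```rust
use ndarray::array;
// Assumed module path and names; the real crate API may differ.
use scirs2_optimize::constrained::{minimize_constrained, Constraint, Method};

// Define an objective function: f(x) = (x0 - 1)^2 + (x1 - 2)^2
fn objective(x: &[f64]) -> f64 {
    (x[0] - 1.0).powi(2) + (x[1] - 2.0).powi(2)
}

// Define a constraint in "g(x) >= 0" form:
// 3 - x0 - x1 >= 0, i.e. x[0] + x[1] <= 3
fn constraint(x: &[f64]) -> f64 {
    3.0 - x[0] - x[1]
}

fn main() {
    let x0 = array![0.0, 0.0];

    // Hypothetical constraint wrapper and solver call, mirroring SciPy's
    // `constraints=[{"type": "ineq", "fun": ...}]` convention.
    let constraints = vec![Constraint::ineq(constraint)];
    let result = minimize_constrained(objective, &x0, &constraints, Method::SLSQP, None)
        .expect("optimization failed");

    println!("solution: {:?}", result.x);
}
```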
### Least Squares Optimization
A sketch of a small linear data-fitting problem follows. The residual closure is what the solver minimizes in the least-squares sense, and passing `None` for the optional Jacobian asks the solver to fall back to finite differences. The `least_squares` entry point and `Method` enum are assumed names.
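```rust
use ndarray::array;
// Assumed module path and names; the real crate API may differ.
use scirs2_optimize::least_squares::{least_squares, Method};

fn main() {
    // Data points lying near y = 2x + 1
    let xs = [0.0f64, 1.0, 2.0, 3.0];
    let ys = [1.1f64, 2.9, 5.2, 6.9];

    // Define the residual function: r_i = y_i - (m * x_i + b) for p = [m, b]
    let residual = move |p: &[f64]| -> Vec<f64> {
        xs.iter()
            .zip(ys.iter())
            .map(|(&x, &y)| y - (p[0] * x + p[1]))
            .collect()
    };

    let p0 = array![1.0, 0.0]; // initial guess for [m, b]

    // Hypothetical call in the spirit of SciPy's `least_squares`; the `None`
    // stands in for the optional analytic Jacobian.
    let result = least_squares(residual, &p0, Method::LevenbergMarquardt, None)
        .expect("least squares failed");

    println!("fitted [m, b]: {:?}", result.x);
}
```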
### Root Finding
The sketch below solves a two-equation nonlinear system F(x) = 0 with the hybrid (modified Powell) method. The `root` entry point and `Method::Hybr` variant are assumed names mirroring SciPy's `root(fun, x0, method="hybr")`.
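```rust
use ndarray::array;
// Assumed module path and names; the real crate API may differ.
use scirs2_optimize::roots::{root, Method};

// Define the system F(x) = 0 whose root we want:
//   x0^2 + x1^2 - 4 = 0   (circle of radius 2)
//   x0 - x1         = 0   (diagonal line)
fn system(x: &[f64]) -> Vec<f64> {
    vec![x[0].powi(2) + x[1].powi(2) - 4.0, x[0] - x[1]]
}

fn main() {
    let x0 = array![1.0, 1.0];

    // Hypothetical call; expect a root near [sqrt(2), sqrt(2)].
    let result = root(system, &x0, Method::Hybr, None)
        .expect("root finding failed");

    println!("root: {:?}", result.x);
}
```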
## Numerical Stability
The optimization algorithms are designed with numerical stability in mind:
- Gradient calculations include checks for small values to avoid division by zero (see the sketch after this list)
- Trust region methods handle degenerate cases robustly
- Line search strategies have safeguards against infinite loops and numerical issues
- Appropriate defaults are chosen to ensure algorithms work across a wide range of problems
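As an illustration of the first point, a guarded forward-difference gradient might look like the following sketch; this is illustrative only, not the crate's internal implementation:

```rust
/// Forward-difference gradient with a step-size safeguard: the step scales
/// with |x[i]| but is bounded away from zero so the divisor never underflows.
fn finite_diff_gradient<F>(f: F, x: &[f64]) -> Vec<f64>
where
    F: Fn(&[f64]) -> f64,
{
    let eps = f64::EPSILON.sqrt(); // ~1.5e-8, a standard step-size choice
    let fx = f(x);
    let mut grad = vec![0.0; x.len()];
    let mut xh = x.to_vec();
    for i in 0..x.len() {
        // Guard against a vanishing step when x[i] is near zero.
        let h = eps * x[i].abs().max(1.0);
        xh[i] = x[i] + h;
        grad[i] = (f(&xh) - fx) / h;
        xh[i] = x[i];
    }
    grad
}
```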
## Algorithm Selection

Choose the appropriate algorithm based on your problem:

- **Unconstrained optimization:**
  - `Nelder-Mead`: Robust, doesn't require derivatives, but can be slow for high-dimensional problems
  - `BFGS`: Fast convergence for smooth functions, requires only function values and gradients
  - `Powell`: Good for functions where derivatives are unavailable or unreliable
  - `CG` (Conjugate Gradient): Efficient for large-scale problems
- **Constrained optimization:**
  - `SLSQP`: Efficient for problems with equality and inequality constraints
  - `Trust-Constr`: Trust-region algorithm that handles nonlinear constraints well
- **Least squares:**
  - `LevenbergMarquardt`: Robust for most nonlinear least squares problems
  - `TrustRegionReflective`: Good for bound-constrained problems
- **Root finding:**
  - `Hybr`: Robust hybrid method (modified Powell algorithm)
  - `Broyden1`/`Broyden2`: Good for systems where Jacobian evaluation is expensive
  - `Anderson`: Accelerates convergence for iterative methods
  - `Krylov`: Efficient for large-scale systems
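As a rough illustration (reusing the assumed `Method` enum from the sketches above, with hypothetical variant names), a caller might pick a method based on problem characteristics:

```rust
use scirs2_optimize::unconstrained::Method; // assumed, as above

// Hypothetical helper: choose an unconstrained method from problem traits.
fn choose_method(has_gradient: bool, dim: usize) -> Method {
    if !has_gradient {
        Method::NelderMead // derivative-free, robust in low dimensions
    } else if dim > 1_000 {
        Method::CG // conjugate gradient scales to large problems
    } else {
        Method::BFGS // fast convergence for smooth functions
    }
}
```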
## Error Handling

All functions return `OptimizeResult<OptimizeResults<T>>`, where:

- `OptimizeResult` is a `Result` type that can contain errors such as convergence failures
- `OptimizeResults<T>` contains the optimization results, including the solution, function value, and convergence information
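A minimal sketch of handling both cases, reusing the assumed `minimize` API from above; the result field names (`x`, `fun`) are assumed to follow SciPy's `OptimizeResult` conventions:

```rust
use ndarray::array;
use scirs2_optimize::unconstrained::{minimize, Method}; // assumed, as above

fn main() {
    let f = |x: &[f64]| x[0].powi(2) + x[1].powi(2);
    let x0 = array![1.0, 1.0];

    // Match on the returned Result instead of unwrapping, so convergence
    // failures surface as recoverable errors rather than panics.
    match minimize(f, &x0, Method::BFGS, None) {
        Ok(res) => println!("minimum at {:?}, f = {}", res.x, res.fun),
        Err(e) => eprintln!("optimization failed: {}", e),
    }
}
```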
## Performance Considerations
- Most algorithms have been optimized for numerical stability and efficiency
- The code leverages Rust's strong type system and memory safety features
- Performance is comparable to other Rust optimization libraries
## License
This crate is part of the SciRS2 project and is licensed under the Apache License 2.0.