# SciRS2 Optimization Module

scirs2-optimize (v0.1.4) is a production-ready optimization library covering unconstrained and constrained optimization, least-squares problems, root finding, and global optimization. Following the SciRS2 POLICY, it provides a high-performance Rust implementation of SciPy's optimization functionality with an ergonomic API, maintaining ecosystem consistency through the scirs2-core abstractions.
## Features

### Core Optimization Methods

#### Unconstrained Optimization
- Nelder-Mead simplex algorithm with adaptive parameters
- BFGS and L-BFGS quasi-Newton methods
- Powell's direction set method with line search
- Conjugate Gradient with Polak-Ribière and Fletcher-Reeves variants (see the sketch after this list)
- Full bounds support for all methods
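
The two conjugate-gradient variants differ only in the scalar β that mixes the new gradient into the previous search direction (d₁ = −g₁ + β·d₀). A standalone sketch of the update rules using ndarray — illustrative only, not the crate's internals:

```rust
use ndarray::Array1;

/// Fletcher-Reeves: beta = g1·g1 / g0·g0
fn beta_fletcher_reeves(g0: &Array1<f64>, g1: &Array1<f64>) -> f64 {
    g1.dot(g1) / g0.dot(g0)
}

/// Polak-Ribière, clamped at zero (the "PR+" rule common in practice):
/// beta = max(0, g1·(g1 - g0) / g0·g0)
fn beta_polak_ribiere(g0: &Array1<f64>, g1: &Array1<f64>) -> f64 {
    ((g1.dot(g1) - g1.dot(g0)) / g0.dot(g0)).max(0.0)
}
```

Fletcher-Reeves is slightly cheaper per iteration; Polak-Ribière typically recovers better from poor search directions, which is why the clamped PR+ form is a popular default.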
#### Constrained Optimization
- SLSQP (Sequential Least Squares Programming)
- Trust Region Constrained algorithm
- Augmented Lagrangian methods
- Advanced constraint handling
#### Least Squares Optimization
- Levenberg-Marquardt with adaptive damping
- Trust Region Reflective algorithm
- Robust variants: Huber, Bisquare, Cauchy loss functions (defined in the sketch after this list)
- Weighted, bounded, separable, and total least squares
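
Robust losses replace the squared residual ρ(r) = r²/2 with functions that grow more slowly for large |r|, so a few outliers stop dominating the fit. Standalone definitions of the three losses — illustrative; the crate exposes them through its least-squares API:

```rust
/// Huber loss: quadratic near zero, linear in the tails.
fn huber(r: f64, delta: f64) -> f64 {
    if r.abs() <= delta {
        0.5 * r * r
    } else {
        delta * (r.abs() - 0.5 * delta)
    }
}

/// Cauchy loss: logarithmic growth, down-weighting large residuals even harder.
fn cauchy(r: f64, c: f64) -> f64 {
    0.5 * c * c * (1.0 + (r / c).powi(2)).ln()
}

/// Tukey bisquare loss: residuals beyond `c` get a constant penalty,
/// i.e. gross outliers have zero influence on the gradient.
fn bisquare(r: f64, c: f64) -> f64 {
    if r.abs() <= c {
        let t = 1.0 - (r / c).powi(2);
        (c * c / 6.0) * (1.0 - t.powi(3))
    } else {
        c * c / 6.0
    }
}
```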
#### Root Finding
- Hybrid methods (modified Powell)
- Broyden's methods (Good and Bad variants; the Good update is sketched after this list)
- Anderson acceleration for iterative methods
- Krylov subspace methods (GMRES)
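
Broyden's "good" method maintains an approximate Jacobian B and refreshes it with a rank-1 correction after each step instead of re-evaluating the full Jacobian. A minimal sketch of that update using ndarray (illustrative only):

```rust
use ndarray::{Array1, Array2, Axis};

/// Rank-1 update of Broyden's "good" method:
/// B <- B + (df - B·dx) dxᵀ / (dxᵀ dx)
fn broyden_good_update(b: &mut Array2<f64>, dx: &Array1<f64>, df: &Array1<f64>) {
    let denom = dx.dot(dx);
    if denom <= f64::EPSILON {
        return; // step too small to carry Jacobian information
    }
    let resid = df - &b.dot(dx);
    // outer product (n,1) x (1,n) -> (n,n)
    let outer = resid
        .view()
        .insert_axis(Axis(1))
        .dot(&dx.view().insert_axis(Axis(0)));
    *b += &(outer / denom);
}
```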
### Global Optimization

#### Metaheuristic Algorithms
- Differential Evolution with adaptive strategies (the core DE/rand/1/bin step is sketched after this list)
- Particle Swarm Optimization
- Simulated Annealing with adaptive cooling
- Basin-hopping with local search
- Dual Annealing combining fast and classical annealing
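
As a concrete example of how these metaheuristics move through the search space, here is the classic DE/rand/1/bin step that differential evolution builds on — a standalone illustration using the rand crate, not the crate's adaptive implementation:

```rust
use rand::Rng; // external `rand` crate, for this illustration only

/// Build one DE/rand/1/bin trial vector: mutate three distinct members,
/// then binomially cross over with the current target vector.
fn de_trial(
    pop: &[Vec<f64>],
    target: usize,
    f: f64,  // differential weight, typically 0.5..1.0
    cr: f64, // crossover rate, typically 0.7..0.9
    rng: &mut impl Rng,
) -> Vec<f64> {
    let n = pop[0].len();
    // pick r1, r2, r3 distinct from each other and from `target`
    let mut idx = [0usize; 3];
    let mut k = 0;
    while k < 3 {
        let r = rng.gen_range(0..pop.len());
        if r != target && !idx[..k].contains(&r) {
            idx[k] = r;
            k += 1;
        }
    }
    let j_rand = rng.gen_range(0..n); // force at least one mutated component
    (0..n)
        .map(|j| {
            if j == j_rand || rng.gen::<f64>() < cr {
                pop[idx[0]][j] + f * (pop[idx[1]][j] - pop[idx[2]][j])
            } else {
                pop[target][j]
            }
        })
        .collect()
}
```

The trial vector replaces the target only if it scores better, so the population improves monotonically; "adaptive strategies" tune `f` and `cr` on the fly.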
#### Bayesian Optimization
- Gaussian Process surrogate models
- Multiple acquisition functions (EI, LCB, PI); EI is sketched after this list
- Automatic hyperparameter tuning
- Multi-start and clustering strategies
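
Acquisition functions decide where to sample next by trading off exploring uncertain regions against exploiting promising ones. A self-contained Expected Improvement for minimization, with the normal CDF built from the Abramowitz-Stegun erf approximation (illustrative only, not the crate's implementation):

```rust
use std::f64::consts::{PI, SQRT_2};

/// Standard normal pdf.
fn pdf(z: f64) -> f64 {
    (-0.5 * z * z).exp() / (2.0 * PI).sqrt()
}

/// Standard normal cdf via the Abramowitz-Stegun erf approximation (|err| < 1.5e-7).
fn cdf(z: f64) -> f64 {
    let x = z / SQRT_2;
    let t = 1.0 / (1.0 + 0.3275911 * x.abs());
    let poly = t * (0.254829592
        + t * (-0.284496736 + t * (1.421413741 + t * (-1.453152027 + t * 1.061405429))));
    let erf = 1.0 - poly * (-x * x).exp();
    0.5 * (1.0 + erf.copysign(x))
}

/// Expected Improvement (minimization): how much we expect to beat `f_best`
/// at a point where the GP posterior has mean `mu` and std `sigma`.
fn expected_improvement(mu: f64, sigma: f64, f_best: f64) -> f64 {
    if sigma <= 0.0 {
        return (f_best - mu).max(0.0);
    }
    let z = (f_best - mu) / sigma;
    (f_best - mu) * cdf(z) + sigma * pdf(z)
}
```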
#### Multi-objective Optimization
- NSGA-II for bi-objective problems
- NSGA-III for many-objective problems
- Scalarization methods (weighted sum, Tchebycheff, ε-constraint); two are sketched below
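
Scalarization collapses a vector of objectives into a single number so that ordinary single-objective solvers apply; sweeping the weights traces out the Pareto front. Two of the listed schemes as standalone functions (illustrative):

```rust
/// Weighted-sum scalarization: sum of w_i * f_i.
fn weighted_sum(f: &[f64], w: &[f64]) -> f64 {
    f.iter().zip(w).map(|(fi, wi)| wi * fi).sum()
}

/// Tchebycheff scalarization: worst weighted deviation from the ideal point z*.
/// Unlike the weighted sum, it can reach non-convex parts of the Pareto front.
fn tchebycheff(f: &[f64], w: &[f64], z_star: &[f64]) -> f64 {
    f.iter()
        .zip(w)
        .zip(z_star)
        .map(|((fi, wi), zi)| wi * (fi - zi).abs())
        .fold(f64::NEG_INFINITY, f64::max)
}
```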
### Performance & Advanced Features

#### High Performance Computing
- Parallel evaluation with configurable worker threads (generic pattern sketched after this list)
- SIMD-accelerated operations
- Memory-efficient algorithms for large-scale problems
- JIT compilation for performance-critical functions
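
Parallel evaluation here means mapping an expensive objective over many candidate points at once. SciRS2 routes parallelism through its scirs2-core abstractions; purely as an illustration of the pattern (rayon is an assumption of this sketch, not necessarily what the crate uses internally):

```rust
use rayon::prelude::*; // external `rayon` crate, for this illustration only

/// Score a whole population of candidate points in parallel.
/// Results come back in the same order as the input.
fn evaluate_population(
    pop: &[Vec<f64>],
    objective: impl Fn(&[f64]) -> f64 + Sync,
) -> Vec<f64> {
    pop.par_iter().map(|x| objective(x)).collect()
}
```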
#### Automatic Differentiation
- Forward-mode AD for gradient computation (dual-number sketch after this list)
- Reverse-mode AD for high-dimensional problems
- Sparse numerical differentiation
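
Forward-mode AD propagates derivatives alongside values using dual numbers a + b·ε with ε² = 0: arithmetic on the ε part reproduces the chain rule automatically. A minimal self-contained sketch (not the crate's AD engine):

```rust
use std::ops::{Add, Mul};

/// Dual number: `re` carries the value, `du` carries the derivative.
#[derive(Clone, Copy, Debug)]
struct Dual {
    re: f64,
    du: f64,
}

impl Add for Dual {
    type Output = Dual;
    fn add(self, rhs: Dual) -> Dual {
        Dual { re: self.re + rhs.re, du: self.du + rhs.du }
    }
}

impl Mul for Dual {
    type Output = Dual;
    fn mul(self, rhs: Dual) -> Dual {
        // product rule falls out of ε² = 0
        Dual { re: self.re * rhs.re, du: self.re * rhs.du + self.du * rhs.re }
    }
}

fn main() {
    // differentiate f(x) = x² + x at x = 3 by seeding du = 1
    let x = Dual { re: 3.0, du: 1.0 };
    let one = Dual { re: 1.0, du: 0.0 };
    let fx = x * x + x * one;
    println!("f(3) = {}, f'(3) = {}", fx.re, fx.du); // 12 and 7
}
```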
#### Stochastic Optimization
- SGD variants with momentum and Nesterov acceleration
- Adam, AdamW, RMSprop optimizers (the Adam update is sketched after this list)
- Mini-batch processing for large datasets
- Adaptive learning rate schedules
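
For reference, the bias-corrected Adam update these optimizers are built around, as a standalone sketch using ndarray (not the crate's optimizer API):

```rust
use ndarray::Array1;

/// One Adam step; the caller carries the state (m, v, t) between calls.
fn adam_step(
    theta: &mut Array1<f64>, // parameters, updated in place
    grad: &Array1<f64>,      // gradient at theta
    m: &mut Array1<f64>,     // first-moment estimate
    v: &mut Array1<f64>,     // second-moment estimate
    t: &mut i32,             // step counter
    lr: f64,                 // step size, e.g. 1e-3
) {
    let (beta1, beta2, eps) = (0.9, 0.999, 1e-8);
    *t += 1;
    *m = beta1 * &*m + (1.0 - beta1) * grad;
    *v = beta2 * &*v + (1.0 - beta2) * (grad * grad);
    let m_hat = &*m / (1.0 - beta1.powi(*t)); // bias correction
    let v_hat = &*v / (1.0 - beta2.powi(*t));
    *theta -= &(lr * &m_hat / (v_hat.mapv(f64::sqrt) + eps));
}
```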
#### Specialized Methods
- Async optimization for slow function evaluations
- Sparse matrix optimization
- Multi-start strategies with clustering
## Installation

Add the following to your Cargo.toml:

```toml
[dependencies]
scirs2-optimize = "0.1.4"
```

For advanced features, enable optional feature flags:

```toml
[dependencies]
scirs2-optimize = { version = "0.1.4", features = ["async"] }
```
## Quick Start

The snippets below sketch the intended call shapes; module paths and signatures are assumptions modeled on the crate's SciPy-like design, so check docs.rs/scirs2-optimize for the authoritative API.

### Basic Unconstrained Optimization

```rust
use ndarray::{array, ArrayView1};
// Assumed module path and signature — verify against the crate docs.
use scirs2_optimize::unconstrained::{minimize, Method};

/// Rosenbrock function: global minimum 0 at (1, 1).
fn rosenbrock(x: &ArrayView1<f64>) -> f64 {
    (1.0 - x[0]).powi(2) + 100.0 * (x[1] - x[0].powi(2)).powi(2)
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let x0 = array![-1.2, 1.0]; // classic starting point
    let result = minimize(rosenbrock, &x0, Method::BFGS, None)?;
    println!("minimum at {:?}", result.x);
    Ok(())
}
```
### Global Optimization

```rust
use ndarray::ArrayView1;
// Assumed entry point — see the crate docs.
use scirs2_optimize::global::differential_evolution;

/// Sphere function: global minimum 0 at the origin.
fn sphere(x: &ArrayView1<f64>) -> f64 {
    x.iter().map(|v| v * v).sum()
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let bounds = vec![(-5.0, 5.0); 2]; // search box per dimension
    let result = differential_evolution(sphere, &bounds, None)?;
    println!("global minimum near {:?}", result.x);
    Ok(())
}
```
### Robust Least Squares

```rust
use ndarray::{array, Array1};
// Hypothetical names sketching the robust least-squares call shape.
use scirs2_optimize::least_squares::{robust_least_squares, HuberLoss};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let xdata = array![0.0, 1.0, 2.0, 3.0, 4.0];
    let ydata = array![0.1, 1.1, 1.9, 3.2, 40.0]; // last point is a gross outlier
    // residuals of the linear model y ≈ a + b·x
    let residual = move |p: &Array1<f64>| -> Array1<f64> { &ydata - (p[0] + p[1] * &xdata) };
    let p0 = array![0.0, 0.0];
    let fit = robust_least_squares(residual, &p0, HuberLoss::new(1.0))?;
    println!("robust fit: {:?}", fit.x); // Huber loss keeps the outlier from dominating
    Ok(())
}
```
### Bayesian Optimization

```rust
// Hypothetical entry point for the Bayesian optimizer; the real options/builder
// API may differ — see the crate docs.
use scirs2_optimize::global::bayesian_optimization;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let bounds = vec![(-2.0, 2.0)];
    // stand-in for a slow black-box function; BO spends few evaluations wisely
    let objective = |x: &[f64]| (x[0] - 0.7).powi(2) + 0.1 * (3.0 * x[0]).sin();
    let result = bayesian_optimization(objective, &bounds, 30)?; // 30-evaluation budget
    println!("best point found: {:?}", result.x);
    Ok(())
}
```
## Why Choose scirs2-optimize?

### Production Ready
- Stable API with comprehensive error handling
- Extensive test coverage and numerical validation
- Memory-safe implementation with zero-cost abstractions
### High Performance
- SIMD-accelerated operations where applicable
- Parallel evaluation support
- Memory-efficient algorithms for large-scale problems
- JIT compilation for critical performance paths
### Intelligent Defaults
- Robust numerical stability safeguards
- Adaptive parameters that work across problem types
- Automatic algorithm selection helpers
### Comprehensive Toolkit
- Complete SciPy optimize API coverage
- Advanced methods beyond SciPy (Bayesian optimization, multi-objective)
- Seamless integration with ndarray ecosystem
### Scientific Computing Focus
- IEEE 754 compliance and careful numerical handling
- Extensive documentation with mathematical background
- Benchmarked against reference implementations
## Algorithm Selection Guide
| Problem Type | Recommended Method | Use Case |
|---|---|---|
| Smooth unconstrained | BFGS, L-BFGS | Fast convergence with gradients |
| Noisy/non-smooth | Nelder-Mead, Powell | Derivative-free robust optimization |
| Large-scale | L-BFGS, CG | Memory-efficient for high dimensions |
| Global minimum | DifferentialEvolution, BayesianOptimization | Avoid local minima |
| With constraints | SLSQP, TrustConstr | Handle complex constraint sets |
| Least squares | LevenbergMarquardt | Nonlinear curve fitting |
| With outliers | HuberLoss, BisquareLoss | Robust regression |
## Integration & Ecosystem
- Zero-copy integration with ndarray and nalgebra
- Feature flags for optional dependencies (async, parallel, SIMD)
- Workspace compatibility with other scirs2 modules
- Pure Rust implementation with OxiBLAS (no system dependencies)
- C API bindings available for integration with existing codebases
## License

This project is licensed under the Apache License 2.0. See the LICENSE file for details.