Optimization & Root Finding — deterministic numerical solvers.
§Determinism Contract
All algorithms in this module are fully deterministic: given the same inputs
and the same objective/gradient functions, they produce bit-identical results
across runs. Floating-point reductions use binned_sum_f64 from the accumulator
module to avoid ordering-dependent rounding.
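A fixed-order reduction is what makes the bit-identical guarantee possible. The crate's actual `binned_sum_f64` is not shown here; the following is a minimal sketch of the same idea using an order-fixed pairwise summation (names and structure are assumptions, not the crate's implementation):

```rust
// Sketch of an order-fixed reduction in the spirit of binned_sum_f64
// (hypothetical reimplementation; the crate's accumulator may differ).
// Splitting at a deterministic midpoint fixes the tree of additions,
// so the rounding sequence is identical on every run.
fn pairwise_sum(xs: &[f64]) -> f64 {
    match xs.len() {
        0 => 0.0,
        1 => xs[0],
        n => {
            let mid = n / 2;
            pairwise_sum(&xs[..mid]) + pairwise_sum(&xs[mid..])
        }
    }
}

fn main() {
    let xs: Vec<f64> = (0..1_000).map(|i| (i as f64) * 1e-3).collect();
    let a = pairwise_sum(&xs);
    let b = pairwise_sum(&xs);
    // Same inputs, same addition tree: bit-identical result.
    assert_eq!(a.to_bits(), b.to_bits());
}
```

By contrast, a parallel or chunked left-to-right sum whose chunking depends on thread count would round differently across runs, which is exactly what the determinism contract rules out.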
§Scalar Root Finding
- bisect — bisection method (guaranteed convergence for bracketed roots)
- brentq — Brent’s method (IQI + bisection fallback, superlinear convergence)
- newton_scalar — Newton-Raphson (quadratic convergence near root)
- secant — secant method (superlinear convergence, no derivative needed)
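To illustrate the bracketing guarantee that bisection provides, here is a minimal standalone sketch; the name `bisect_sketch` and its signature are assumptions for illustration, not the crate's `bisect` API:

```rust
// Bisection sketch (hypothetical; the crate's bisect signature may differ).
// Requires f(a) and f(b) to have opposite signs, i.e. a bracketed root.
// Each iteration halves the bracket, so convergence is guaranteed.
fn bisect_sketch(f: impl Fn(f64) -> f64, mut a: f64, mut b: f64, tol: f64) -> f64 {
    assert!(f(a) * f(b) < 0.0, "root must be bracketed");
    while b - a > tol {
        let m = 0.5 * (a + b);
        if f(a) * f(m) <= 0.0 {
            b = m; // root lies in [a, m]
        } else {
            a = m; // root lies in [m, b]
        }
    }
    0.5 * (a + b)
}

fn main() {
    // x^2 - 2 changes sign on [1, 2], so the root sqrt(2) is bracketed.
    let r = bisect_sketch(|x| x * x - 2.0, 1.0, 2.0, 1e-12);
    assert!((r - 2f64.sqrt()).abs() < 1e-10);
}
```

The bracket halves on every step, so the error after n iterations is at most (b − a) / 2ⁿ; this is the linear but unconditional convergence the list above refers to.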
§Unconstrained Optimization
- minimize_gd — gradient descent with fixed learning rate
- minimize_bfgs — BFGS quasi-Newton with Armijo line search
- minimize_lbfgs — limited-memory BFGS with m history vectors
- minimize_nelder_mead — Nelder-Mead simplex (derivative-free)
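The simplest of these, fixed-step gradient descent, can be sketched in a few lines. The function name and signature below are assumptions for illustration, not the crate's `minimize_gd` API:

```rust
// Fixed-learning-rate gradient descent sketch
// (hypothetical; minimize_gd's actual signature may differ).
fn gd_sketch(
    grad: impl Fn(&[f64]) -> Vec<f64>,
    mut x: Vec<f64>,
    lr: f64,
    iters: usize,
) -> Vec<f64> {
    for _ in 0..iters {
        let g = grad(&x);
        // Step against the gradient with a constant step size.
        for (xi, gi) in x.iter_mut().zip(&g) {
            *xi -= lr * gi;
        }
    }
    x
}

fn main() {
    // Minimize f(x, y) = (x - 1)^2 + (y + 2)^2; gradient is (2(x-1), 2(y+2)).
    let grad = |v: &[f64]| vec![2.0 * (v[0] - 1.0), 2.0 * (v[1] + 2.0)];
    let x = gd_sketch(grad, vec![0.0, 0.0], 0.1, 200);
    assert!((x[0] - 1.0).abs() < 1e-6);
    assert!((x[1] + 2.0).abs() < 1e-6);
}
```

The quasi-Newton variants (BFGS, L-BFGS) replace the fixed step with an Armijo backtracking line search and an approximation of the inverse Hessian, trading a constant factor of work per iteration for much faster convergence on ill-conditioned problems.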
Structs§
- OptResult
- Result of an unconstrained optimization run.
Functions§
- bisect
- Bisection method for scalar root finding.
- brentq
- Brent’s method for scalar root finding.
- minimize_bfgs
- BFGS quasi-Newton method with Armijo line search.
- minimize_gd
- Gradient descent with fixed learning rate.
- minimize_lbfgs
- L-BFGS (limited-memory BFGS) with Armijo line search.
- minimize_nelder_mead
- Nelder-Mead simplex method (derivative-free).
- newton_scalar
- Newton-Raphson method for scalar root finding.
- penalty_objective
- Penalty method for constrained optimization.
- project_box
- Project a point onto a box constraint [lower, upper].
- projected_gd_step
- Projected gradient descent step with box constraints.
- secant
- Secant method for scalar root finding.
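The box-constrained helpers above compose naturally: a projected gradient step is an unconstrained step followed by a componentwise clamp onto [lower, upper]. The sketch below illustrates that composition; the names `project_box_sketch` and `projected_gd_step_sketch` and their signatures are assumptions, not the crate's API:

```rust
// Box projection sketch (hypothetical; project_box's signature may differ).
// Clamps each coordinate into its [lower, upper] interval.
fn project_box_sketch(x: &[f64], lower: &[f64], upper: &[f64]) -> Vec<f64> {
    x.iter()
        .zip(lower.iter().zip(upper))
        .map(|(&xi, (&lo, &hi))| xi.clamp(lo, hi))
        .collect()
}

// One projected gradient descent step: descend, then project back
// into the feasible box (hypothetical reimplementation).
fn projected_gd_step_sketch(
    x: &[f64],
    g: &[f64],
    lr: f64,
    lower: &[f64],
    upper: &[f64],
) -> Vec<f64> {
    let stepped: Vec<f64> = x.iter().zip(g).map(|(&xi, &gi)| xi - lr * gi).collect();
    project_box_sketch(&stepped, lower, upper)
}

fn main() {
    let lower = [0.0, 0.0];
    let upper = [1.0, 1.0];
    // A large step would leave the box; projection clamps it back.
    let next = projected_gd_step_sketch(&[0.5, 0.5], &[10.0, -10.0], 0.1, &lower, &upper);
    assert_eq!(next, vec![0.0, 1.0]);
}
```

Iterating this step converges to a stationary point of the objective restricted to the box, which is the standard projected-gradient scheme the function list describes.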