Module optimize


Optimization & Root Finding — deterministic numerical solvers.

§Determinism Contract

All algorithms in this module are fully deterministic: given the same inputs and the same objective/gradient functions, they produce bit-identical results across runs. Floating-point reductions use binned_sum_f64 from the accumulator module to avoid ordering-dependent rounding.
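The binned-summation idea can be sketched as follows. This is not the crate's actual `binned_sum_f64` (its signature and bin layout live in the accumulator module); the chunk size and sequential reduction here are illustrative assumptions showing why a fixed reduction tree gives bit-identical results:

```rust
/// Illustrative sketch of order-stable binned summation (not the
/// crate's actual `binned_sum_f64`; the bin size is an assumption).
/// Each fixed-size chunk is summed left-to-right, then the per-bin
/// totals are summed left-to-right: the reduction order is fixed by
/// the data layout, so repeated calls are bit-identical.
fn binned_sum(xs: &[f64], bin_size: usize) -> f64 {
    assert!(bin_size > 0);
    xs.chunks(bin_size)
        .map(|chunk| chunk.iter().sum::<f64>())
        .sum()
}

fn main() {
    let xs: Vec<f64> = (1..=1000).map(|i| 1.0 / i as f64).collect();
    let a = binned_sum(&xs, 64);
    let b = binned_sum(&xs, 64);
    // Bit-identical across calls, not merely approximately equal.
    assert_eq!(a.to_bits(), b.to_bits());
}
```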

§Scalar Root Finding

  • bisect — bisection method (guaranteed convergence for bracketed roots)
  • brentq — Brent’s method (IQI + bisection fallback, superlinear convergence)
  • newton_scalar — Newton-Raphson (quadratic convergence near root)
  • secant — secant method (superlinear convergence, no derivative needed)
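The bracketing guarantee behind `bisect` can be seen in a minimal standalone sketch (the module's actual `bisect` signature and error handling may differ; the tolerance convention here is an assumption):

```rust
/// Minimal bisection sketch, not the module's `bisect`: requires a
/// sign change on [a, b] and halves the bracket until it is narrower
/// than `tol`.
fn bisect_sketch<F: Fn(f64) -> f64>(f: F, mut a: f64, mut b: f64, tol: f64) -> Option<f64> {
    let mut fa = f(a);
    if fa * f(b) > 0.0 {
        return None; // root must be bracketed: f(a) and f(b) differ in sign
    }
    while (b - a).abs() > tol {
        let m = 0.5 * (a + b);
        let fm = f(m);
        // Keep whichever half-interval still brackets the root.
        if fa * fm <= 0.0 {
            b = m;
        } else {
            a = m;
            fa = fm;
        }
    }
    Some(0.5 * (a + b))
}

fn main() {
    // sqrt(2) as the positive root of x^2 - 2 on [1, 2].
    let r = bisect_sketch(|x| x * x - 2.0, 1.0, 2.0, 1e-12).unwrap();
    assert!((r - 2f64.sqrt()).abs() < 1e-10);
}
```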

§Unconstrained Optimization

  • minimize_gd — gradient descent with fixed learning rate
  • minimize_bfgs — BFGS quasi-Newton method with Armijo line search
  • minimize_lbfgs — L-BFGS (limited-memory BFGS) with Armijo line search
  • minimize_nelder_mead — Nelder-Mead simplex method (derivative-free)
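The fixed-learning-rate scheme of `minimize_gd` can be sketched standalone (the module's actual signature, stopping rule, and `OptResult` fields are not reproduced here; the iteration count and step size are assumptions):

```rust
/// Standalone gradient-descent sketch (not the module's `minimize_gd`):
/// takes a gradient function and applies a fixed-learning-rate step
/// for a set number of iterations.
fn gd_sketch<G: Fn(&[f64]) -> Vec<f64>>(grad: G, mut x: Vec<f64>, lr: f64, iters: usize) -> Vec<f64> {
    for _ in 0..iters {
        let g = grad(&x);
        for (xi, gi) in x.iter_mut().zip(&g) {
            *xi -= lr * gi; // fixed-step descent along -gradient
        }
    }
    x
}

fn main() {
    // Minimize f(x, y) = (x - 3)^2 + (y + 1)^2;
    // its gradient is (2(x - 3), 2(y + 1)) and its minimizer is (3, -1).
    let grad = |x: &[f64]| vec![2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)];
    let x = gd_sketch(grad, vec![0.0, 0.0], 0.1, 200);
    assert!((x[0] - 3.0).abs() < 1e-6 && (x[1] + 1.0).abs() < 1e-6);
}
```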

Structs§

OptResult
Result of an unconstrained optimization run.

Functions§

bisect
Bisection method for scalar root finding.
brentq
Brent’s method for scalar root finding.
minimize_bfgs
BFGS quasi-Newton method with Armijo line search.
minimize_gd
Gradient descent with fixed learning rate.
minimize_lbfgs
L-BFGS (limited-memory BFGS) with Armijo line search.
minimize_nelder_mead
Nelder-Mead simplex method (derivative-free).
newton_scalar
Newton-Raphson method for scalar root finding.
penalty_objective
Penalty method for constrained optimization.
project_box
Project a point onto a box constraint [lower, upper].
projected_gd_step
Projected gradient descent step with box constraints.
secant
Secant method for scalar root finding.
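A typical way `project_box` and `projected_gd_step` fit together is sketched below; the signatures are illustrative assumptions, not the module's actual API:

```rust
/// Illustrative box projection (not the module's `project_box`):
/// clamp each coordinate into [lower[i], upper[i]].
fn project_box_sketch(x: &mut [f64], lower: &[f64], upper: &[f64]) {
    for ((xi, &lo), &hi) in x.iter_mut().zip(lower).zip(upper) {
        *xi = xi.clamp(lo, hi);
    }
}

/// Illustrative projected step (not the module's `projected_gd_step`):
/// take an unconstrained gradient step, then project back into the box.
fn projected_gd_step_sketch(x: &mut [f64], grad: &[f64], lr: f64, lower: &[f64], upper: &[f64]) {
    for (xi, gi) in x.iter_mut().zip(grad) {
        *xi -= lr * gi;
    }
    project_box_sketch(x, lower, upper);
}

fn main() {
    // Minimize (x - 2)^2 subject to x in [0, 1]: the constrained
    // minimizer is the boundary point x = 1.
    let (lower, upper) = ([0.0], [1.0]);
    let mut x = [0.5];
    for _ in 0..100 {
        let g = [2.0 * (x[0] - 2.0)];
        projected_gd_step_sketch(&mut x, &g, 0.1, &lower, &upper);
    }
    assert!((x[0] - 1.0).abs() < 1e-9);
}
```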