# resopt

`resopt` is a Rust crate for declarative residual optimization with optional linear constraints.

It models problems of the form:

```text
minimize_x   loss(Ax - b)
           + lambda/2 * ||Lx - x_ref||_2^2   (optional)
subject to   Cx = d
             Gx <= h
             l <= x <= u
```
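For intuition, the objective above can be evaluated directly for a small dense problem. The sketch below is plain Rust with no `resopt` types, assuming the default squared-L2 loss and taking the regularization operator `L` to be the identity:

```rust
// Evaluate 0.5 * ||A x - b||^2 + (lambda / 2) * ||x - x_ref||^2
// for a tiny 3x2 system (illustration only; L = I here).
fn objective(a: &[[f64; 2]; 3], b: &[f64; 3], x: &[f64; 2], lambda: f64, x_ref: &[f64; 2]) -> f64 {
    let mut data_term = 0.0;
    for i in 0..3 {
        // Residual of row i: (A x - b)_i
        let r = a[i][0] * x[0] + a[i][1] * x[1] - b[i];
        data_term += r * r;
    }
    let reg: f64 = x
        .iter()
        .zip(x_ref)
        .map(|(xi, xr)| (xi - xr) * (xi - xr))
        .sum();
    0.5 * data_term + 0.5 * lambda * reg
}

fn main() {
    let a = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]];
    let b = [1.0, 2.0, 3.0];
    // At x = (1, 2) the residual A x - b is zero, so only the
    // regularization term (0.5 * 0.1 * ||x||^2) contributes.
    let val = objective(&a, &b, &[1.0, 2.0], 0.1, &[0.0, 0.0]);
    println!("{val}");
}
```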

The crate is designed around a small public API:

- build a residual problem from matrices and vectors
- add equality constraints, inequality constraints, and bounds
- validate and classify the problem structure
- solve with a backend such as Clarabel

## Features

- `clarabel` (default): enables the Clarabel backend

Current backend support:

- `Loss::L2Squared`: solved by the Clarabel backend, with optional Tikhonov regularization
- `Loss::L1`, `Loss::LInf`, `Loss::Huber`: accepted by the model API, but not yet solved by Clarabel
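These loss names correspond to standard penalties on the residual vector `r = Ax - b`. The plain functions below are an illustration of those formulas only, not part of the `resopt` API:

```rust
// Standard residual penalties corresponding to the Loss variants.

// Squared L2: 0.5 * sum(r_i^2)
fn l2_squared(r: &[f64]) -> f64 {
    0.5 * r.iter().map(|v| v * v).sum::<f64>()
}

// L1: sum(|r_i|)
fn l1(r: &[f64]) -> f64 {
    r.iter().map(|v| v.abs()).sum()
}

// L-infinity: max(|r_i|)
fn l_inf(r: &[f64]) -> f64 {
    r.iter().fold(0.0_f64, |m, v| m.max(v.abs()))
}

// Huber with threshold delta: quadratic near zero, linear in the tails.
fn huber(r: &[f64], delta: f64) -> f64 {
    r.iter()
        .map(|v| {
            let a = v.abs();
            if a <= delta {
                0.5 * a * a
            } else {
                delta * (a - 0.5 * delta)
            }
        })
        .sum()
}

fn main() {
    let r = [1.0, -2.0, 0.5];
    println!("{} {} {} {}", l2_squared(&r), l1(&r), l_inf(&r), huber(&r, 1.0));
}
```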

## Installation

```toml
[dependencies]
resopt = "0.3.0"
```

To disable the default backend:

```toml
[dependencies]
resopt = { version = "0.3.0", default-features = false }
```

## Quick Start

```rust
use resopt::{
    ConstrainedResidualProblemBuilder, Loss, Matrix, SolveStatus,
};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let a = Matrix::from_row_major(
        3,
        2,
        vec![
            1.0, 0.0,
            0.0, 1.0,
            1.0, 1.0,
        ],
    )?;

    let problem = ConstrainedResidualProblemBuilder::new()
        .matrix(a)
        .target(vec![1.0, 2.0, 3.0])
        .loss(Loss::L2Squared)
        .build()?;

    let result = problem.solve()?;

    assert_eq!(result.status(), SolveStatus::Solved);
    assert!(result.solution().is_some());

    Ok(())
}
```

## Constrained Example

```rust
use resopt::{
    Bounds, ConstrainedResidualProblem, LinearEqualities, LinearInequalities, LinearResidual,
    Loss, Matrix, TikhonovRegularization,
};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let residual = LinearResidual::new(
        Matrix::from_row_major(3, 2, vec![1.0, 0.0, 0.0, 1.0, 1.0, 1.0])?,
        vec![1.0, 2.0, 2.5],
    )?;

    let eq = LinearEqualities::new(
        Matrix::from_row_major(1, 2, vec![1.0, -1.0])?,
        vec![0.0],
    )?;

    let ineq = LinearInequalities::new(
        Matrix::from_row_major(1, 2, vec![1.0, 1.0])?,
        vec![3.0],
    )?;

    let bounds = Bounds::new(
        vec![Some(0.0), Some(0.0)],
        vec![Some(10.0), Some(10.0)],
    )?;

    let regularization = TikhonovRegularization::ridge(2, 0.1)?;

    let result = ConstrainedResidualProblem::new(residual, Loss::L2Squared)?
        .add_equalities(eq)?
        .add_inequalities(ineq)?
        .with_bounds(bounds)?
        .with_regularization(regularization)?
        .solve()?;

    println!("{:?}", result.status());
    Ok(())
}
```

See [`examples/basic_l2.rs`](examples/basic_l2.rs) and
[`examples/clarabel_constrained_l2.rs`](examples/clarabel_constrained_l2.rs) for runnable examples.

## Crate Status

The core modeling API is stable enough for experimentation, but the solver surface is still intentionally small.

For `Loss::L2Squared`, the Clarabel backend applies an explicit lifted formulation with
optional scaling through `SolveOptions`, and supports optional Tikhonov regularization.
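The exact lifting is an internal detail of the backend; one common conic lifting for a squared-L2 objective (stated here as an assumption about the approach, not a guarantee of the implementation) introduces a residual variable `r` and an epigraph variable `t`:

```text
minimize_{x, r, t}   t
subject to           r = Ax - b
                     ||r||_2^2 <= 2t     (rotated second-order cone)
                     Cx = d,  Gx <= h,  l <= x <= u
```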

If you publish results or build production workflows on top of `resopt`, pin the crate version and validate solver behavior on your problem family.

## Development

The repository includes a pre-commit hook installer:

```powershell
./scripts/install-git-hooks.ps1
```

The hook runs:

- `cargo fmt --check`
- `cargo clippy --all-targets --all-features -- -D warnings`
- `cargo test --all-features`