
Trait ExtendedModel 

pub trait ExtendedModel {
    // Provided methods
    fn extended_update64(&mut self, _params: &[f64]) { ... }
    fn extended_update32(&mut self, _params: &[f32]) { ... }
    fn extended_cost64(&self, _params: &[f64]) -> f64 { ... }
    fn extended_cost32(&self, _params: &[f32]) -> f32 { ... }
    fn extended_compute64(&mut self, _params: &[f64]) { ... }
    fn extended_compute32(&mut self, _params: &[f32]) { ... }
    fn extended_jacobian64(
        &mut self,
        _params: &[f64],
        _rows: &mut Vec<JacobianRow<f64>>,
        _cid: &mut u32,
    ) { ... }
    fn extended_jacobian32(
        &mut self,
        _params: &[f32],
        _rows: &mut Vec<JacobianRow<f32>>,
        _cid: &mut u32,
    ) { ... }
}

Extension hooks for custom constraints on #[arael(root, extended)] structs.

Use this when you need constraints that can’t be expressed via #[arael(constraint(...))] at compile time — for example, constraints parsed from user input at runtime, or constraints that need access to the full root struct.

A key use case is runtime differentiation: parse an equation string with arael_sym::parse, symbolically differentiate with E::diff, then evaluate numerically each solver iteration. This powers the parametric expression dimensions in arael-sketch (where the user types d0 * 2 + 3 as a dimension value) and the runtime_fit_demo example (which accepts an arbitrary curve-fitting equation from the command line).
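The shape of that workflow can be sketched without the crate. The expression type below is a minimal stand-in for illustration only — arael_sym's real API (arael_sym::parse, E::diff) is richer — but it shows the pattern: build an expression tree at runtime, differentiate it symbolically once, then evaluate both expression and derivative numerically on each iteration.

```rust
use std::collections::HashMap;

// Minimal expression tree illustrating runtime symbolic differentiation.
// A sketch of the idea only; arael_sym's actual types differ.
#[derive(Clone)]
enum Expr {
    Const(f64),
    Sym(&'static str),
    Add(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

impl Expr {
    // d(self)/d(var), via the sum and product rules.
    fn diff(&self, var: &str) -> Expr {
        match self {
            Expr::Const(_) => Expr::Const(0.0),
            Expr::Sym(s) => Expr::Const(if *s == var { 1.0 } else { 0.0 }),
            Expr::Add(a, b) => Expr::Add(Box::new(a.diff(var)), Box::new(b.diff(var))),
            Expr::Mul(a, b) => Expr::Add(
                Box::new(Expr::Mul(Box::new(a.diff(var)), b.clone())),
                Box::new(Expr::Mul(a.clone(), Box::new(b.diff(var)))),
            ),
        }
    }

    // Numeric evaluation against a variable binding.
    fn eval(&self, vars: &HashMap<&str, f64>) -> f64 {
        match self {
            Expr::Const(c) => *c,
            Expr::Sym(s) => vars[s],
            Expr::Add(a, b) => a.eval(vars) + b.eval(vars),
            Expr::Mul(a, b) => a.eval(vars) * b.eval(vars),
        }
    }
}

fn main() {
    // d0 * 2 + 3, as a user might type in arael-sketch.
    let e = Expr::Add(
        Box::new(Expr::Mul(Box::new(Expr::Sym("d0")), Box::new(Expr::Const(2.0)))),
        Box::new(Expr::Const(3.0)),
    );
    let de = e.diff("d0"); // differentiate once, up front
    let vars = HashMap::from([("d0", 5.0)]);
    println!("{} {}", e.eval(&vars), de.eval(&vars)); // prints "13 2"
}
```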

To use: mark the root struct with #[arael(root, extended)] and implement this trait. The macro-generated LmProblem calls these methods at the appropriate points in the optimization loop. Default implementations are no-ops, so you only override what you need.

To write custom gradient and Hessian contributions, add a TripletBlock field to the root struct. The macro automatically zeroes and accumulates it. In extended_compute, push residual contributions into it via TripletBlock::add_residual.
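Assuming add_residual performs the standard Gauss-Newton accumulation (grad += Jᵀr, H += JᵀJ per residual over the listed parameter indices), its effect can be sketched with a dense accumulator — the struct below is illustrative, not TripletBlock's real layout:

```rust
// Dense sketch of what a Gauss-Newton accumulator does with one residual.
// TripletBlock stores sparse triplets instead; this is for clarity only.
struct Accumulator {
    grad: Vec<f64>,
    hess: Vec<Vec<f64>>,
}

impl Accumulator {
    fn new(n: usize) -> Self {
        Self { grad: vec![0.0; n], hess: vec![vec![0.0; n]; n] }
    }

    // For residual r with derivative row `jac` over parameter `indices`:
    // grad[i] += jac[i] * r,  hess[i][j] += jac[i] * jac[j].
    fn add_residual(&mut self, r: f64, indices: &[u32], jac: &[f64]) {
        for (&i, &ji) in indices.iter().zip(jac) {
            self.grad[i as usize] += ji * r;
            for (&j, &jj) in indices.iter().zip(jac) {
                self.hess[i as usize][j as usize] += ji * jj;
            }
        }
    }
}
```

For example, a residual r = 2 with derivatives [3, 1] over parameters 0 and 1 contributes [6, 2] to the gradient and [[9, 3], [3, 1]] to the Hessian approximation; accumulate_blocks then folds these per-block contributions into the global system.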

§Execution order

Each solver iteration runs:

  1. Model::update — copies params into working values
  2. extended_update — set up derived state before calculations
  3. zero_blocks — zeros all Hessian blocks (including TripletBlocks)
  4. Macro-generated constraint loops — fill SelfBlock/CrossBlock
  5. extended_compute — fill TripletBlocks with custom residuals
  6. accumulate_blocks — reads all blocks into global grad/Hessian

For cost evaluation: Model::update → extended_update → macro-generated cost loop → extended_cost.
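The two sequences can be pinned down with a trace — the function names below mirror the steps listed above, but the bodies are illustrative stand-ins, not LmProblem's generated code:

```rust
// Records the per-iteration call order described above (illustrative only).
fn gradient_iteration(log: &mut Vec<&'static str>) {
    log.push("update");            // 1. copy params into working values
    log.push("extended_update");   // 2. set up derived state
    log.push("zero_blocks");       // 3. zero all Hessian blocks
    log.push("constraint_loops");  // 4. fill SelfBlock/CrossBlock
    log.push("extended_compute");  // 5. fill TripletBlocks
    log.push("accumulate_blocks"); // 6. blocks -> global grad/Hessian
}

// Cost evaluation is the shorter path: no blocks are touched.
fn cost_evaluation(log: &mut Vec<&'static str>) {
    log.push("update");
    log.push("extended_update");
    log.push("cost_loop");
    log.push("extended_cost");
}
```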

§Example

Robust curve fitting where the equation is parsed at runtime. The residual and its derivatives are symbolic expressions evaluated numerically each iteration (see examples/runtime_fit_demo.rs):

#[arael::model]
#[arael(root, extended)]
struct RegressionModel {
    coeffs: refs::Vec<Coefficient>,           // optimizable parameters
    hb: TripletBlock<f64>,                    // Gauss-Newton accumulator
    residual_expr: Option<arael_sym::E>,      // parsed equation
    derivs: Vec<(String, u32, arael_sym::E)>, // (name, param_index, d_residual/d_param)
    data: Vec<(f64, f64)>,
    param_names: Vec<String>,
}

// Setup (once, before solving): parse the equation and differentiate
// symbolically. `sigma` is the measurement noise used to scale residuals.
let expr = arael_sym::parse("a * x + b").unwrap();
let residual = (expr - arael_sym::symbol("y")) / arael_sym::constant(sigma);
let dr_da = residual.diff("a");
let dr_db = residual.diff("b");
// Store `residual` in `residual_expr` and the derivatives in `derivs`.

impl ExtendedModel for RegressionModel {
    fn extended_compute64(&mut self, params: &[f64]) {
        let Some(expr) = &self.residual_expr else { return };
        // Bind current parameter values by name, then evaluate the
        // symbolically differentiated expressions numerically.
        let mut vars = std::collections::HashMap::new();
        for (name, p) in self.param_names.iter().zip(params) {
            vars.insert(name.as_str(), *p);
        }
        for &(x, y) in &self.data {
            vars.insert("x", x);
            vars.insert("y", y);
            let r = expr.eval(&vars).unwrap();
            let dr: Vec<f64> = self.derivs.iter()
                .map(|(_, _, d)| d.eval(&vars).unwrap()).collect();
            let indices: Vec<u32> = self.derivs.iter()
                .map(|(_, idx, _)| *idx).collect();
            self.hb.add_residual(r, &indices, &dr);
        }
    }

    fn extended_cost64(&self, params: &[f64]) -> f64 {
        let Some(expr) = &self.residual_expr else { return 0.0 };
        let mut vars = std::collections::HashMap::new();
        for (name, p) in self.param_names.iter().zip(params) {
            vars.insert(name.as_str(), *p);
        }
        // Sum of squared residuals
        self.data.iter().filter_map(|&(x, y)| {
            vars.insert("x", x);
            vars.insert("y", y);
            let r = expr.eval(&vars).ok()?;
            Some(r * r)
        }).sum()
    }
}

See examples/runtime_fit_demo.rs for the complete working example, and arael-sketch-solver for a production use of this pattern with parametric expression dimensions.

Provided Methods§

fn extended_update64(&mut self, _params: &[f64])

Called after update64, before cost/constraint calculations. Use to compute derived state that constraints depend on.

fn extended_update32(&mut self, _params: &[f32])

Called after update32, before cost/constraint calculations.

fn extended_cost64(&self, _params: &[f64]) -> f64

Additional cost contribution (f64). Called after the macro-generated cost loop.

fn extended_cost32(&self, _params: &[f32]) -> f32

Additional cost contribution (f32).

fn extended_compute64(&mut self, _params: &[f64])

Compute custom constraint residuals (f64). Called after macro-generated constraints and before accumulation. Push gradient/Hessian contributions into a TripletBlock field.

fn extended_compute32(&mut self, _params: &[f32])

Compute custom constraint residuals (f32).

fn extended_jacobian64( &mut self, _params: &[f64], _rows: &mut Vec<JacobianRow<f64>>, _cid: &mut u32, )

Append Jacobian rows for runtime constraints (f64). cid is the running constraint counter; increment it once per constraint object.
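The cid convention can be sketched as follows — note that the JacobianRow fields used here are assumptions for illustration, not the crate's real layout:

```rust
// Hypothetical row layout: a constraint id plus (param_index, derivative)
// entries. The real JacobianRow<T> may differ; only the cid discipline
// shown here is the point.
struct JacobianRow<T> {
    cid: u32,
    entries: Vec<(u32, T)>,
}

// One constraint object = one cid, however many rows it appends.
fn append_equal_params_constraint(rows: &mut Vec<JacobianRow<f64>>, cid: &mut u32) {
    // Single scalar residual p0 - p1 = 0 -> one row, tagged with the current cid.
    rows.push(JacobianRow { cid: *cid, entries: vec![(0, 1.0), (1, -1.0)] });
    *cid += 1; // increment once per constraint object
}
```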

fn extended_jacobian32( &mut self, _params: &[f32], _rows: &mut Vec<JacobianRow<f32>>, _cid: &mut u32, )

Append Jacobian rows for runtime constraints (f32).

Implementors§