Trait OptObserver 

pub trait OptObserver: Send {
    // Required method
    fn on_step(&self, values: &HashMap<String, VariableEnum>, iteration: usize);

    // Provided methods
    fn set_iteration_metrics(
        &self,
        _cost: f64,
        _gradient_norm: f64,
        _damping: Option<f64>,
        _step_norm: f64,
        _step_quality: Option<f64>,
    ) { ... }
    fn set_matrix_data(
        &self,
        _hessian: Option<SparseColMat<usize, f64>>,
        _gradient: Option<Mat<f64>>,
    ) { ... }
    fn on_optimization_complete(
        &self,
        _values: &HashMap<String, VariableEnum>,
        _iterations: usize,
    ) { ... }
}

Observer trait for monitoring optimization progress.

Implement this trait to create custom observers that are notified at each optimization iteration. Observers receive the current variable values and iteration number, enabling real-time monitoring, visualization, logging, or custom analysis.

§Design Notes

  • Observers should be lightweight and non-blocking
  • Errors in observers should not crash optimization (handle internally)
  • For expensive operations (file I/O, network), consider buffering
  • Observers receive immutable references (cannot modify optimization state)

§Thread Safety

Observers must be Send to support parallel optimization in the future. Use interior mutability (RefCell, Mutex) if you need to mutate state.
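
For example, here is a minimal sketch of a Send observer that mutates state from &self through a Mutex (StepCounter is an illustrative name, not part of the crate):

use apex_solver::observers::OptObserver;
use apex_solver::core::problem::VariableEnum;
use std::collections::HashMap;
use std::sync::Mutex;

/// Illustrative observer: counts steps behind a Mutex so the
/// &self callback can mutate state while the type stays Send.
struct StepCounter {
    steps: Mutex<usize>,
}

impl OptObserver for StepCounter {
    fn on_step(&self, _values: &HashMap<String, VariableEnum>, _iteration: usize) {
        // Handle a poisoned lock internally rather than panicking,
        // per the design notes above.
        if let Ok(mut steps) = self.steps.lock() {
            *steps += 1;
        }
    }
}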

Required Methods§

fn on_step(&self, values: &HashMap<String, VariableEnum>, iteration: usize)

Called after each optimization iteration.

§Arguments
  • values - Current variable values (manifold states)
  • iteration - Current iteration number (0 = initial values, 1+ = after steps)
§Implementation Guidelines
  • Keep this method fast to avoid slowing optimization
  • Handle errors internally (log warnings, don’t panic)
  • Don’t mutate values (you receive &HashMap)
  • Consider buffering expensive operations
§Examples
use apex_solver::observers::OptObserver;
use apex_solver::core::problem::VariableEnum;
use std::collections::HashMap;

struct SimpleLogger;

impl OptObserver for SimpleLogger {
    fn on_step(&self, values: &HashMap<String, VariableEnum>, iteration: usize) {
        // Track optimization progress.
        println!("iteration {}: {} variables", iteration, values.len());
    }
}

Provided Methods§

fn set_iteration_metrics(
    &self,
    _cost: f64,
    _gradient_norm: f64,
    _damping: Option<f64>,
    _step_norm: f64,
    _step_quality: Option<f64>,
)

Set iteration metrics for visualization and monitoring.

This method is called before on_step to provide optimization metrics such as the cost, gradient norm, and damping parameter. Observers can use this data for visualization, logging, or analysis.

§Arguments
  • cost - Current cost function value
  • gradient_norm - L2 norm of the gradient vector
  • damping - Damping parameter (for Levenberg-Marquardt, may be None for other solvers)
  • step_norm - L2 norm of the parameter update step
  • step_quality - Step quality metric (e.g., rho for trust region methods)
§Default Implementation

The default implementation does nothing, allowing simple observers to ignore metrics.
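
For instance, an observer could override this method to accumulate a cost history for later plotting (a minimal sketch; CostHistory is an illustrative name, not part of the crate):

use apex_solver::observers::OptObserver;
use apex_solver::core::problem::VariableEnum;
use std::collections::HashMap;
use std::sync::Mutex;

/// Illustrative observer: records the cost reported at each iteration.
struct CostHistory {
    costs: Mutex<Vec<f64>>,
}

impl OptObserver for CostHistory {
    fn on_step(&self, _values: &HashMap<String, VariableEnum>, _iteration: usize) {}

    fn set_iteration_metrics(
        &self,
        cost: f64,
        _gradient_norm: f64,
        _damping: Option<f64>,
        _step_norm: f64,
        _step_quality: Option<f64>,
    ) {
        // Swallow a poisoned-lock error instead of panicking.
        if let Ok(mut costs) = self.costs.lock() {
            costs.push(cost);
        }
    }
}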

fn set_matrix_data(
    &self,
    _hessian: Option<SparseColMat<usize, f64>>,
    _gradient: Option<Mat<f64>>,
)

Set matrix data for advanced visualization.

This method provides access to the Hessian matrix and gradient vector for observers that want to visualize matrix structure or perform advanced analysis.

§Arguments
  • hessian - Sparse Hessian matrix (J^T * J)
  • gradient - Gradient vector (J^T * r)
§Default Implementation

The default implementation does nothing, allowing simple observers to ignore matrices.
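
For example, an observer could log the dimensions of the matrices it receives (a minimal sketch; MatrixInspector is an illustrative name, and the imports assume the matrix types come from the faer crate, as the signatures suggest):

use apex_solver::observers::OptObserver;
use apex_solver::core::problem::VariableEnum;
use faer::Mat;
use faer::sparse::SparseColMat;
use std::collections::HashMap;

/// Illustrative observer: logs the shape of the Hessian and gradient.
struct MatrixInspector;

impl OptObserver for MatrixInspector {
    fn on_step(&self, _values: &HashMap<String, VariableEnum>, _iteration: usize) {}

    fn set_matrix_data(
        &self,
        hessian: Option<SparseColMat<usize, f64>>,
        gradient: Option<Mat<f64>>,
    ) {
        if let Some(h) = hessian {
            println!("Hessian: {} x {}", h.nrows(), h.ncols());
        }
        if let Some(g) = gradient {
            println!("Gradient: {} x {}", g.nrows(), g.ncols());
        }
    }
}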

fn on_optimization_complete(
    &self,
    _values: &HashMap<String, VariableEnum>,
    _iterations: usize,
)

Called when optimization completes.

This method is called once at the end of optimization, after all iterations are complete. Use this for final visualization, cleanup, or summary logging.

§Arguments
  • values - Final optimized variable values
  • iterations - Total number of iterations performed
§Default Implementation

The default implementation does nothing, allowing simple observers to ignore completion.

§Examples
use apex_solver::observers::OptObserver;
use apex_solver::core::problem::VariableEnum;
use std::collections::HashMap;

struct FinalStateLogger;

impl OptObserver for FinalStateLogger {
    fn on_step(&self, _values: &HashMap<String, VariableEnum>, _iteration: usize) {}

    fn on_optimization_complete(&self, values: &HashMap<String, VariableEnum>, iterations: usize) {
        println!("Optimization completed after {} iterations with {} variables",
                 iterations, values.len());
    }
}

Implementors§