pub trait SingleValuedFunction: Debug + Sync {
// Required method
fn single_run(&self, phenotype_expressed_values: &[f64]) -> f64;
// Provided method
fn function_floor(&self) -> f64 { ... }
}
Trait for optimization functions that return a single scalar value to minimize.
This is the primary trait users implement to define their optimization problem.
The genetic algorithm will evolve parameters to minimize the value returned by
single_run.
§Common Use Cases
- Mathematical Functions: Minimizing surfaces like Rosenbrock, Rastrigin, Ackley
- Parameter Tuning: Finding optimal configuration values for models
- Engineering Design: Minimizing cost, weight, or error metrics
- Machine Learning: Hyperparameter optimization
§Thread Safety
Functions must be Sync as they are called concurrently from multiple threads
during fitness evaluation.
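Because implementations must be Sync, any shared state inside the function has to use thread-safe primitives. The sketch below illustrates this with an atomic evaluation counter; note that the trait definition here is a hypothetical stand-in mirroring the real one in hill_descent_lib, so the example compiles on its own.

```rust
use std::fmt::Debug;
use std::sync::atomic::{AtomicU64, Ordering};

// Hypothetical stand-in for the trait; the real definition lives in hill_descent_lib.
trait SingleValuedFunction: Debug + Sync {
    fn single_run(&self, phenotype_expressed_values: &[f64]) -> f64;
}

#[derive(Debug)]
struct CountingSphere {
    // Interior mutability that remains safe under concurrent calls.
    evaluations: AtomicU64,
}

impl SingleValuedFunction for CountingSphere {
    fn single_run(&self, params: &[f64]) -> f64 {
        // Atomics are fine through &self; a plain Cell or RefCell would not be Sync.
        self.evaluations.fetch_add(1, Ordering::Relaxed);
        params.iter().map(|x| x * x).sum()
    }
}

fn main() {
    let f = CountingSphere { evaluations: AtomicU64::new(0) };
    let score = f.single_run(&[3.0, 4.0]);
    assert_eq!(score, 25.0);
    assert_eq!(f.evaluations.load(Ordering::Relaxed), 1);
}
```

If the function needs no shared state at all (the common case), a unit struct like the examples below satisfies Sync automatically.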
§Examples
§Simple Quadratic Function
use hill_descent_lib::SingleValuedFunction;
#[derive(Debug)]
struct Quadratic;
impl SingleValuedFunction for Quadratic {
fn single_run(&self, params: &[f64]) -> f64 {
// Minimize f(x, y) = x² + y²
// Global minimum: f(0, 0) = 0
params.iter().map(|x| x * x).sum()
}
}
// Use with setup_world:
use hill_descent_lib::{setup_world, GlobalConstants, TrainingData};
let param_range = vec![-10.0..=10.0, -10.0..=10.0];
let constants = GlobalConstants::new(100, 10);
let mut world = setup_world(&param_range, constants, Box::new(Quadratic));
for _ in 0..100 {
world.training_run(TrainingData::None { floor_value: 0.0 });
}
assert!(world.get_best_score() < 0.01); // Should find near-zero minimum
§Function with Custom Floor
use hill_descent_lib::SingleValuedFunction;
#[derive(Debug)]
struct ShiftedFunction;
impl SingleValuedFunction for ShiftedFunction {
fn single_run(&self, params: &[f64]) -> f64 {
// Function with minimum value of -5.0
params.iter().map(|x| x * x).sum::<f64>() - 5.0
}
fn function_floor(&self) -> f64 {
-5.0 // Specify theoretical minimum
}
}
§Complex Optimization Problem
use hill_descent_lib::SingleValuedFunction;
#[derive(Debug)]
struct ModelFitness {
target_data: Vec<f64>,
}
impl SingleValuedFunction for ModelFitness {
fn single_run(&self, params: &[f64]) -> f64 {
// params[0]: learning_rate, params[1]: regularization, etc.
// Return mean squared error or similar metric
let predictions = self.run_model(params);
self.compute_error(&predictions)
}
}
impl ModelFitness {
fn run_model(&self, _params: &[f64]) -> Vec<f64> {
// Model execution logic
vec![0.0; self.target_data.len()]
}
fn compute_error(&self, predictions: &[f64]) -> f64 {
// Error calculation
predictions.iter()
.zip(&self.target_data)
.map(|(p, t)| (p - t).powi(2))
.sum::<f64>() / self.target_data.len() as f64
}
}
§Implementation Notes
- The function should be deterministic: the same inputs must always produce the same output
- Avoid expensive operations where possible; the function is called millions of times during optimization
- Return f64::INFINITY for invalid parameter combinations
- Consider implementing function_floor if your function has a known theoretical minimum
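The INFINITY guard in the notes above can be sketched as follows. The trait definition here is a hypothetical stand-in for the one in hill_descent_lib so that the example is self-contained; LogSum is an illustrative function that is only defined for positive inputs.

```rust
use std::fmt::Debug;

// Hypothetical stand-in for the trait; the real one is in hill_descent_lib.
trait SingleValuedFunction: Debug + Sync {
    fn single_run(&self, phenotype_expressed_values: &[f64]) -> f64;
}

// ln(x) is undefined for x <= 0, so guard with INFINITY instead of panicking;
// the optimizer then treats invalid regions as maximally unfit and moves away.
#[derive(Debug)]
struct LogSum;

impl SingleValuedFunction for LogSum {
    fn single_run(&self, params: &[f64]) -> f64 {
        if params.iter().any(|&x| x <= 0.0) {
            return f64::INFINITY; // invalid parameter combination
        }
        // Sum of squared logs; minimum of 0.0 at all ones.
        params.iter().map(|x| x.ln().powi(2)).sum()
    }
}

fn main() {
    assert!(LogSum.single_run(&[-1.0, 2.0]).is_infinite());
    assert_eq!(LogSum.single_run(&[1.0, 1.0]), 0.0);
}
```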
§See Also
crate::WorldFunction - For multi-output functions (automatically implemented)
crate::setup_world - Initialize optimization with your function
super::World::training_run - Run optimization epochs
Required Methods§
fn single_run(&self, phenotype_expressed_values: &[f64]) -> f64
Evaluates the function for given parameter values.
This is the core method that defines your optimization problem. The genetic algorithm will call this method millions of times with different parameter combinations, seeking the combination that produces the minimum return value.
§Parameters
phenotype_expressed_values - The parameter values to evaluate. Length matches the number of ranges provided to setup_world. Values are guaranteed to be within the bounds specified by those ranges.
§Returns
The fitness score to minimize. Lower values indicate better solutions.
§Examples
use hill_descent_lib::SingleValuedFunction;
#[derive(Debug)]
struct Distance;
impl SingleValuedFunction for Distance {
fn single_run(&self, params: &[f64]) -> f64 {
// Euclidean distance from origin
params.iter().map(|x| x * x).sum::<f64>().sqrt()
}
}
§Thread Safety
This method must be thread-safe as it’s called concurrently. Avoid mutable state or use appropriate synchronization primitives.
Provided Methods§
fn function_floor(&self) -> f64
Returns the theoretical minimum value (floor) of this function.
This is used to validate that computed values are not below the theoretical minimum, which would indicate a bug in the function implementation.
§Default Implementation
The default implementation returns 0.0, maintaining backward compatibility with
existing functions that assume a minimum of zero.
§Examples
For a function with a known minimum of -5.0:
fn function_floor(&self) -> f64 {
-5.0
}
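The default-versus-override behavior can be shown in a self-contained sketch. The trait definition below is a hypothetical stand-in mirroring the provided method's default body; Sphere relies on the default zero floor, while ShiftedSphere overrides it.

```rust
use std::fmt::Debug;

// Hypothetical stand-in mirroring the trait's provided method.
trait SingleValuedFunction: Debug + Sync {
    fn single_run(&self, phenotype_expressed_values: &[f64]) -> f64;
    fn function_floor(&self) -> f64 {
        0.0 // default: backward-compatible zero floor
    }
}

#[derive(Debug)]
struct Sphere; // true minimum is 0.0, so the default floor is already correct

impl SingleValuedFunction for Sphere {
    fn single_run(&self, params: &[f64]) -> f64 {
        params.iter().map(|x| x * x).sum()
    }
}

#[derive(Debug)]
struct ShiftedSphere; // true minimum is -5.0, so the floor must be overridden

impl SingleValuedFunction for ShiftedSphere {
    fn single_run(&self, params: &[f64]) -> f64 {
        params.iter().map(|x| x * x).sum::<f64>() - 5.0
    }
    fn function_floor(&self) -> f64 {
        -5.0
    }
}

fn main() {
    assert_eq!(Sphere.function_floor(), 0.0);
    assert_eq!(ShiftedSphere.function_floor(), -5.0);
    // Sanity check: no computed score should fall below the declared floor.
    assert!(ShiftedSphere.single_run(&[0.0, 0.0]) >= ShiftedSphere.function_floor());
}
```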