//! # Optimization Problem Trait Module
//!
//! This module defines the [`Problem`] trait, which provides a standardized interface
//! for optimization problems in the globalsearch-rs library. Any optimization problem
//! must implement this trait to be compatible with the OQNLP algorithm.
//!
//! ## Problem Trait Overview
//!
//! The [`Problem`] trait defines the mathematical structure of an optimization problem
//! through several key methods:
//!
//! ### Required Methods
//! - [`objective`](Problem::objective): The objective function to minimize
//! - [`variable_bounds`](Problem::variable_bounds): Box constraints for variables
//!
//! ### Optional Methods (Depending on Local Solver Requirements)
//! - [`gradient`](Problem::gradient): First-order derivatives
//! - [`hessian`](Problem::hessian): Second-order derivatives
//! - [`constraints`](Problem::constraints): General inequality constraints, only valid with the COBYLA local solver
//!
//! ## Implementation Guidelines
//!
//! ### Objective Function
//! - **Return Type**: `Result<f64, EvaluationError>` for error handling
//! - **Convention**: Lower values indicate better solutions (minimization)
//! - **Error Handling**: Return `EvaluationError` for invalid inputs or computation failures
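//!
//! For example, a guard against non-finite inputs could look like this (a standalone sketch; `String` stands in for the crate's `EvaluationError` so the snippet has no dependencies, and the sphere function is a placeholder objective):
//!
//! ```rust
//! // Hypothetical objective with an input guard; returns Err for invalid points
//! fn objective(x: &[f64]) -> Result<f64, String> {
//!     if x.iter().any(|v| !v.is_finite()) {
//!         return Err("non-finite input".to_string());
//!     }
//!     Ok(x.iter().map(|v| v * v).sum()) // sphere function as a placeholder
//! }
//!
//! assert!(objective(&[f64::NAN, 0.0]).is_err());
//! assert_eq!(objective(&[3.0, 4.0]), Ok(25.0));
//! ```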
//!
//! ### Variable Bounds
//! - **Format**: 2D array where each row is `[lower_bound, upper_bound]`
//! - **Requirement**: Must be finite and well-defined
//! - **Purpose**: Defines the feasible region for optimization
//!
//! ### Constraints (Optional, only valid with the COBYLA local solver)
//! - **Sign Convention**:
//! - `g(x) ≥ 0`: Constraint satisfied
//! - `g(x) < 0`: Constraint violated
//! - **Return Type**: Vector of constraint function closures
//! - **Use Cases**: Nonlinear inequality constraints beyond simple bounds
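//!
//! As a standalone illustration of the sign convention (using a plain closure, not the solver's constraint type), a unit-disk constraint `g(x) = 1 - x₀² - x₁²` behaves as follows:
//!
//! ```rust
//! // g(x) >= 0 inside the unit disk (satisfied), g(x) < 0 outside (violated)
//! let g = |x: &[f64]| 1.0 - x[0].powi(2) - x[1].powi(2);
//!
//! assert!(g(&[0.5, 0.5]) >= 0.0); // satisfied: g = 0.5
//! assert!(g(&[1.5, 0.0]) < 0.0);  // violated: g = -1.25
//! ```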
//!
//! ## Example: Six-Hump Camel Function
//!
//! ```rust
//! /// References:
//! ///
//! /// Molga, M., & Smutnicki, C. Test functions for optimization needs (April 3, 2005), pp. 11-12. Retrieved January 2025, from https://robertmarks.org/Classes/ENGR5358/Papers/functions.pdf
//!
//! use globalsearch::problem::Problem;
//! use globalsearch::types::EvaluationError;
//! use ndarray::{array, Array1, Array2};
//!
//! #[derive(Debug, Clone)]
//! pub struct SixHumpCamel;
//!
//! impl Problem for SixHumpCamel {
//!     fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError> {
//!         Ok(
//!             (4.0 - 2.1 * x[0].powi(2) + x[0].powi(4) / 3.0) * x[0].powi(2)
//!                 + x[0] * x[1]
//!                 + (-4.0 + 4.0 * x[1].powi(2)) * x[1].powi(2),
//!         )
//!     }
//!
//!     // Calculated analytically, reference didn't provide gradient
//!     fn gradient(&self, x: &Array1<f64>) -> Result<Array1<f64>, EvaluationError> {
//!         Ok(array![
//!             (8.0 - 8.4 * x[0].powi(2) + 2.0 * x[0].powi(4)) * x[0] + x[1],
//!             x[0] + (-8.0 + 16.0 * x[1].powi(2)) * x[1]
//!         ])
//!     }
//!
//!     // Calculated analytically, reference didn't provide hessian
//!     fn hessian(&self, x: &Array1<f64>) -> Result<Array2<f64>, EvaluationError> {
//!         Ok(array![
//!             [
//!                 (4.0 * x[0].powi(2) - 4.2) * x[0].powi(2)
//!                     + 4.0 * (4.0 / 3.0 * x[0].powi(3) - 4.2 * x[0]) * x[0]
//!                     + 2.0 * (x[0].powi(4) / 3.0 - 2.1 * x[0].powi(2) + 4.0),
//!                 1.0
//!             ],
//!             [1.0, 40.0 * x[1].powi(2) + 2.0 * (4.0 * x[1].powi(2) - 4.0)],
//!         ])
//!     }
//!
//!     fn variable_bounds(&self) -> Array2<f64> {
//!         array![[-3.0, 3.0], [-2.0, 2.0]]
//!     }
//! }
//! ```
use crate::types::EvaluationError;
use ndarray::{Array1, Array2};
/// # Trait for optimization problems
///
/// This trait defines the methods that an optimization problem must implement: the objective function, gradient, Hessian and variable bounds.
///
/// The objective function is the function to minimize, evaluated at a given point x (`Array1<f64>`).
///
/// The gradient is the vector of first-order partial derivatives of the objective function, evaluated at a given point x (`Array1<f64>`).
///
/// The Hessian is the square matrix of second-order partial derivatives of the objective function, evaluated at a given point x (`Array1<f64>`).
///
/// The variable bounds are the lower and upper bounds for each variable of the optimization problem.
///
/// Constraint functions for constrained optimization problems can also be defined using the `constraints` method.
///
/// The default implementations of the gradient and Hessian return an error indicating that they are not implemented.
/// Some local solvers require the gradient and/or Hessian, while others do not;
/// check the documentation of the local solver you are using to see which are needed.