//! Derivative-free global optimization (metaheuristics).
//!
//! This module provides metaheuristic algorithms for black-box optimization
//! where gradients are unavailable or the landscape is highly multimodal.
//!
//! # Algorithm Categories
//!
//! ## Perturbative Metaheuristics
//! Modify complete solutions through perturbation operators:
//! - [`DifferentialEvolution`] - Population-based, excellent for continuous HPO
//! - [`ParticleSwarm`] - Swarm intelligence with velocity updates
//! - [`SimulatedAnnealing`] - Single-point, Metropolis acceptance
//! - [`GeneticAlgorithm`] - Selection, crossover, mutation
//! - [`HarmonySearch`] - Music-inspired memory-based optimization
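//!
//! To make the Metropolis acceptance rule above concrete, here is a minimal,
//! dependency-free simulated-annealing sketch on a 1-D quadratic. The tiny
//! LCG exists only to keep the example free of random-number crates and is
//! not part of this module's API:
//!
//! ```
//! fn lcg(state: &mut u64) -> f64 {
//!     // Knuth-style 64-bit LCG; top 53 bits mapped to [0, 1).
//!     *state = state.wrapping_mul(6364136223846793005).wrapping_add(1442695040888963407);
//!     (*state >> 11) as f64 / (1u64 << 53) as f64
//! }
//!
//! let f = |x: f64| (x - 3.0).powi(2); // objective with minimum at x = 3
//! let (mut x, mut fx, mut rng) = (0.0_f64, 9.0_f64, 42_u64);
//! let mut temp = 1.0_f64;
//! for _ in 0..20_000 {
//!     let cand = x + (lcg(&mut rng) - 0.5); // random perturbation
//!     let fc = f(cand);
//!     // Metropolis: always accept improvements, occasionally accept worse moves
//!     if fc < fx || lcg(&mut rng) < (-(fc - fx) / temp).exp() {
//!         x = cand;
//!         fx = fc;
//!     }
//!     temp *= 0.999; // geometric cooling schedule
//! }
//! assert!((x - 3.0).abs() < 0.5);
//! ```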
//!
//! ## Benchmark Functions
//! Standard test functions for algorithm evaluation:
//! - [`benchmarks`] - CEC 2013 benchmark suite (Sphere, Rosenbrock, Rastrigin, etc.)
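//!
//! As a sanity check independent of any particular `benchmarks` API, the
//! classic Rastrigin function (global minimum 0 at the origin) can be written
//! in a few lines:
//!
//! ```
//! use std::f64::consts::PI;
//!
//! // f(x) = 10n + sum(x_i^2 - 10 cos(2 pi x_i)); highly multimodal.
//! fn rastrigin(x: &[f64]) -> f64 {
//!     10.0 * x.len() as f64
//!         + x.iter().map(|&xi| xi * xi - 10.0 * (2.0 * PI * xi).cos()).sum::<f64>()
//! }
//!
//! assert!(rastrigin(&[0.0, 0.0]).abs() < 1e-12);
//! assert!((rastrigin(&[1.0, 1.0]) - 2.0).abs() < 1e-9);
//! ```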
//!
//! ## Constructive and Memory-Based Metaheuristics (Phase 3)
//! Build solutions incrementally:
//! - `AntColony` - Pheromone-guided construction
//! - `TabuSearch` - Memory-based local search
//!
//! # Search Space Abstraction
//!
//! Unlike gradient-based optimizers that assume continuous spaces,
//! metaheuristics support diverse problem representations:
//!
//! ```
//! use aprender::metaheuristics::SearchSpace;
//!
//! // Continuous optimization (HPO)
//! let hpo_space = SearchSpace::continuous(5, -10.0, 10.0);
//!
//! // Binary feature selection
//! let feature_space = SearchSpace::binary(100);
//!
//! // Permutation (TSP)
//! let tsp_space = SearchSpace::permutation(50);
//! ```
//!
//! # Example: Hyperparameter Optimization
//!
//! ```
//! use aprender::metaheuristics::{DifferentialEvolution, SearchSpace, Budget, PerturbativeMetaheuristic};
//!
//! // Define search space for learning rate and regularization
//! let space = SearchSpace::Continuous {
//!     dim: 2,
//!     lower: vec![1e-5, 1e-6],
//!     upper: vec![1e-1, 1e-2],
//! };
//!
//! // Objective: minimize validation loss (simulated)
//! let objective = |params: &[f64]| {
//!     let lr = params[0];
//!     let reg = params[1];
//!     // Simulated loss landscape
//!     (lr - 0.01).powi(2) + (reg - 0.001).powi(2) + 0.1 * (lr * 100.0).sin()
//! };
//!
//! let mut de = DifferentialEvolution::default();
//! let result = de.optimize(&objective, &space, Budget::Evaluations(5000));
//!
//! assert!(result.objective_value < 0.1); // Reasonable tolerance for small budget
//! ```
//!
//! # References
//!
//! - Storn & Price (1997): Differential Evolution
//! - Kennedy & Eberhart (1995): Particle Swarm Optimization
//! - Kirkpatrick et al. (1983): Simulated Annealing
//! - Hansen (2016): CMA-ES Tutorial
pub use BinaryGA;
pub use Budget;
pub use DifferentialEvolution;
pub use GeneticAlgorithm;
pub use HarmonySearch;
pub use ParticleSwarm;
pub use PerturbativeMetaheuristic;
pub use SearchSpace;
pub use SimulatedAnnealing;