vrp_core/solver/mod.rs
//! The *solver* module contains basic building blocks for a metaheuristic along with the default
//! implementation.
//!
//! # Metaheuristic
//!
//! A metaheuristic is a high-level algorithmic framework that provides a set of guidelines or strategies
//! to develop heuristic optimization algorithms. Examples of metaheuristics include genetic/evolutionary
//! algorithms, tabu search, simulated annealing, variable neighborhood search, (adaptive) large
//! neighborhood search, ant colony optimization, etc.
//!
//! # Multi-objective decision maker
//!
//! Most VRPs used to model real-world cases are set up with a single objective (e.g. minimizing
//! the cost of the solution). However, the majority of problems encountered in the logistics
//! industry are multi-objective in nature, as the complexity of real-life logistics planning often
//! cannot be reduced to cost alone. Such non-cost factors include:
//!
//! - **balancing work across multiple workers**
//! - **minimization or maximization of fleet usage**
//! - **minimization of unassigned jobs**
//!
//! In most cases, these additional factors conflict with the cost minimization objective, which
//! leads to a nontrivial multi-objective optimization problem where no single solution exists that
//! simultaneously optimizes each objective.
//!
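//! As an illustration only (the actual objective handling lives in `GoalContext` and the `rosomaxa`
//! crate), a hierarchical, lexicographic-style comparison of two solutions could look like the
//! following sketch; the `Summary` type and its fields are hypothetical:
//!
//! ```ignore
//! use std::cmp::Ordering;
//!
//! // hypothetical per-solution summary used only for illustration
//! struct Summary { unassigned: usize, cost: f64 }
//!
//! // prefer fewer unassigned jobs; break ties by lower cost
//! fn compare(a: &Summary, b: &Summary) -> Ordering {
//!     a.unassigned.cmp(&b.unassigned).then_with(|| a.cost.total_cmp(&b.cost))
//! }
//! ```
//!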
//! # Evolutionary algorithm
//!
//! An evolutionary algorithm (EA) is a generic population-based metaheuristic optimization algorithm.
//! This crate provides a custom implementation of an EA which can be divided into the following steps
//! (a simplified sketch is shown below):
//!
//! - **initialization**: an initial population is created using different construction heuristics.
//! - **main loop begin**: enter the evolution loop
//! - **selection**: an individual is selected from the population. Best-fit individuals have more
//!   chances to be selected.
//! - **mutation**: a mutation operator is applied to the selected individual. The default
//!   implementation uses the `ruin and recreate` principle described in a dedicated section below.
//! - **population adjustments**: the new individual is added to the population, which is then
//!   sorted and shrunk to keep it within specific size limits, retaining the best-fit individuals
//!   and some intermediate ones.
//! - **main loop end**: exit the evolution loop when one of the termination criteria is met. See the
//!   `termination` module for details.
//!
//! As there is no crossover operator involved and offspring are produced from a single parent, this
//! algorithm can be characterized as a parthenogenesis-based EA. This approach eliminates the design
//! of a feasible crossover operator, which is a challenging task for the VRP.
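//!
//! A highly simplified sketch of this loop, using hypothetical names rather than the actual types
//! defined in this crate, could look like the following:
//!
//! ```ignore
//! // assumption: `create_initial_population`, `termination`, `select`, `mutate`,
//! // `add_and_shrink` and `best` are illustrative placeholders only
//! let mut population = create_initial_population(&problem);
//! while !termination.is_met(&population) {
//!     // best-fit individuals have a higher chance to be selected
//!     let parent = population.select();
//!     // e.g. ruin and recreate applied to a copy of the parent
//!     let offspring = mutate(parent);
//!     // keep the population sorted and within its size limits
//!     population.add_and_shrink(offspring);
//! }
//! let best_known = population.best();
//! ```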
//!
//! # Population
//!
//! A custom algorithm is implemented to maintain diversity and to guide the search, keeping a
//! trade-off between exploration and exploitation of the solution space. See the `rosomaxa` crate
//! for details.
//!
//! # Ruin and Recreate principle
//!
//! The **ruin and recreate** principle was introduced by [`Schrimpf et al. (2000)`]. The key idea
//! is to ruin a quite large fraction of the solution and then to restore (recreate) it as well as
//! possible, in order to get a new solution better than the previous one. The original algorithm
//! can be described as a large neighborhood search that combines elements of simulated annealing
//! and threshold-accepting algorithms, but this crate reuses only the ruin/recreate idea as a
//! mutation operator.
//!
//! [`Schrimpf et al. (2000)`]: https://www.sciencedirect.com/science/article/pii/S0021999199964136
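//!
//! A minimal sketch of a single ruin and recreate step used as a mutation, with hypothetical
//! `ruin` and `recreate` helpers standing in for the operators from the `search` module:
//!
//! ```ignore
//! // remove a sizeable fraction of assigned jobs from a copy of the selected solution
//! let partial_solution = ruin(&selected_solution);
//! // reinsert the removed jobs using a construction heuristic
//! let candidate = recreate(partial_solution);
//! // the population decides whether the candidate is kept
//! refinement_ctx.add_solution(candidate);
//! ```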
//!
//! # Additionally..
//!
//! The solver is not limited to the R&R principle; it additionally utilizes other heuristics and
//! their combinations. These are picked based on their performance in terms of search quality and
//! introduced latency. Reinforcement learning techniques are used here.

extern crate rand;

use crate::construction::heuristics::InsertionContext;
use crate::models::{GoalContext, Problem, Solution};
use crate::solver::search::Recreate;
use rosomaxa::evolution::*;
use rosomaxa::prelude::*;
use rosomaxa::{get_default_population, TelemetryHeuristicContext};
use std::any::Any;
use std::collections::HashMap;
use std::sync::Arc;

pub use self::heuristic::*;
use rosomaxa::population::Rosomaxa;
use rosomaxa::utils::Timer;

pub mod processing;
pub mod search;

mod heuristic;

/// A type which encapsulates information needed to perform the solution refinement process.
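///
/// Arbitrary typed data can be attached to the refinement process via the `Stateful` trait
/// implemented below. A small sketch (the key name is hypothetical):
///
/// ```ignore
/// refinement_ctx.set_state("iterations_without_improvement".to_string(), 0_usize);
/// let counter = refinement_ctx.get_state::<usize>(&"iterations_without_improvement".to_string());
/// ```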
pub struct RefinementContext {
    /// Original problem definition.
    pub problem: Arc<Problem>,
    /// An environmental context.
    pub environment: Arc<Environment>,
    /// A collection of data associated with refinement process.
    pub state: HashMap<String, Box<dyn Any + Sync + Send>>,
    /// Provides some basic implementation of context functionality.
    inner_context: TelemetryHeuristicContext<GoalContext, InsertionContext>,
}

/// Defines instant refinement speed type.
#[derive(Clone)]
pub enum RefinementSpeed {
    /// Slow speed with ratio estimation.
    Slow(Float),

    /// Moderate speed.
    Moderate,
}

impl RefinementContext {
    /// Creates a new instance of `RefinementContext`.
    pub fn new(
        problem: Arc<Problem>,
        population: TargetPopulation,
        telemetry_mode: TelemetryMode,
        environment: Arc<Environment>,
    ) -> Self {
        let inner_context =
            TelemetryHeuristicContext::new(problem.goal.clone(), population, telemetry_mode, environment.clone());
        Self { problem, environment, inner_context, state: Default::default() }
    }

    /// Adds solution to population.
    pub fn add_solution(&mut self, solution: InsertionContext) {
        self.inner_context.add_solution(solution);
    }
}

impl HeuristicContext for RefinementContext {
    type Objective = GoalContext;
    type Solution = InsertionContext;

    fn objective(&self) -> &Self::Objective {
        self.inner_context.objective()
    }

    fn selected<'a>(&'a self) -> Box<dyn Iterator<Item = &Self::Solution> + 'a> {
        self.inner_context.selected()
    }

    fn ranked<'a>(&'a self) -> Box<dyn Iterator<Item = &Self::Solution> + 'a> {
        self.inner_context.ranked()
    }

    fn statistics(&self) -> &HeuristicStatistics {
        self.inner_context.statistics()
    }

    fn selection_phase(&self) -> SelectionPhase {
        self.inner_context.selection_phase()
    }

    fn environment(&self) -> &Environment {
        self.inner_context.environment()
    }

    fn on_initial(&mut self, solution: Self::Solution, item_time: Timer) {
        self.inner_context.on_initial(solution, item_time)
    }

    fn on_generation(&mut self, offspring: Vec<Self::Solution>, termination_estimate: Float, generation_time: Timer) {
        self.inner_context.on_generation(offspring, termination_estimate, generation_time)
    }

    fn on_result(self) -> HeuristicResult<Self::Objective, Self::Solution> {
        self.inner_context.on_result()
    }
}

impl Stateful for RefinementContext {
    type Key = String;

    fn set_state<T: 'static + Send + Sync>(&mut self, key: Self::Key, state: T) {
        self.state.insert(key, Box::new(state));
    }

    fn get_state<T: 'static + Send + Sync>(&self, key: &Self::Key) -> Option<&T> {
        self.state.get(key).and_then(|v| v.downcast_ref::<T>())
    }

    fn state_mut<T: 'static + Send + Sync, F: Fn() -> T>(&mut self, key: Self::Key, inserter: F) -> &mut T {
        // NOTE may panic if casting fails
        self.state.entry(key).or_insert_with(|| Box::new(inserter())).downcast_mut::<T>().unwrap()
    }
}

/// Wraps recreate method as `InitialOperator`.
pub struct RecreateInitialOperator {
    recreate: Arc<dyn Recreate>,
}

impl RecreateInitialOperator {
    /// Creates a new instance of `RecreateInitialOperator`.
    pub fn new(recreate: Arc<dyn Recreate>) -> Self {
        Self { recreate }
    }
}

impl InitialOperator for RecreateInitialOperator {
    type Context = RefinementContext;
    type Objective = GoalContext;
    type Solution = InsertionContext;

    fn create(&self, heuristic_ctx: &Self::Context) -> Self::Solution {
        let insertion_ctx = InsertionContext::new(heuristic_ctx.problem.clone(), heuristic_ctx.environment.clone());
        self.recreate.run(heuristic_ctx, insertion_ctx)
    }
}

/// Solves a Vehicle Routing Problem and returns a feasible solution in case of success,
/// or an error description if a solution cannot be found.
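///
/// # Examples
///
/// A minimal usage sketch, assuming a `Problem` instance and an `EvolutionConfig` have been built
/// elsewhere (e.g. by a higher-level builder; both names below are placeholders):
///
/// ```ignore
/// let solver = Solver::new(problem, config);
/// let solution = solver.solve()?;
/// ```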
pub struct Solver {
    problem: Arc<Problem>,
    config: EvolutionConfig<RefinementContext, GoalContext, InsertionContext>,
}

impl Solver {
    /// Creates an instance of `Solver` from the provided config.
    pub fn new(
        problem: Arc<Problem>,
        config: EvolutionConfig<RefinementContext, GoalContext, InsertionContext>,
    ) -> Self {
        Self { problem, config }
    }

    /// Solves a Vehicle Routing Problem and returns a feasible solution in case of success,
    /// or an error description if a solution cannot be found.
    pub fn solve(self) -> GenericResult<Solution> {
        (self.config.context.environment.logger)(&format!(
            "total jobs: {}, actors: {}",
            self.problem.jobs.size(),
            self.problem.fleet.actors.len()
        ));

        let (mut solutions, metrics) = EvolutionSimulator::new(self.config)?.run()?;

        // NOTE select the first best individual from the population
        let insertion_ctx = if solutions.is_empty() { None } else { solutions.drain(0..1).next() }
            .ok_or_else(|| "cannot find any solution".to_string())?;

        let solution = (insertion_ctx, metrics).into();

        Ok(solution)
    }
}