pub struct Study<V = f64>
where
    V: PartialOrd,
{ /* private fields */ }
A study manages the optimization process, tracking trials and their results.
The study is parameterized by the objective value type V, which defaults to f64.
The only constraint on V is PartialOrd, allowing comparison of objective values
to determine which trial is best.
When V = f64, the study passes trial history to the sampler for informed
parameter suggestions (e.g., TPE sampler uses history to guide sampling).
§Examples
use optimizer::{Direction, Study};
// Create a study to minimize an objective function
let study: Study<f64> = Study::new(Direction::Minimize);
assert_eq!(study.direction(), Direction::Minimize);
§Implementations
impl<V> Study<V>
where
    V: PartialOrd,
pub fn best_trial(&self) -> Result<CompletedTrial<V>>
where
    V: Clone,
Return the trial with the best objective value.
The “best” trial depends on the optimization direction:
- Direction::Minimize: Returns the trial with the lowest objective value.
- Direction::Maximize: Returns the trial with the highest objective value.
When constraints are present, feasible trials always rank above infeasible trials. Among infeasible trials, those with lower total constraint violation are preferred.
§Errors
Returns Error::NoCompletedTrials if no trials have been completed.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
// Error when no trials completed
assert!(study.best_trial().is_err());
let x_param = FloatParam::new(0.0, 1.0);
let mut trial1 = study.create_trial();
let _ = x_param.suggest(&mut trial1);
study.complete_trial(trial1, 0.8);
let mut trial2 = study.create_trial();
let _ = x_param.suggest(&mut trial2);
study.complete_trial(trial2, 0.3);
let best = study.best_trial().unwrap();
assert_eq!(best.value, 0.3); // Minimize: lower is better
pub fn best_value(&self) -> Result<V>
where
    V: Clone,
Return the best objective value found so far.
The “best” value depends on the optimization direction:
- Direction::Minimize: Returns the lowest objective value.
- Direction::Maximize: Returns the highest objective value.
§Errors
Returns Error::NoCompletedTrials if no trials have been completed.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Maximize);
// Error when no trials completed
assert!(study.best_value().is_err());
let x_param = FloatParam::new(0.0, 1.0);
let mut trial1 = study.create_trial();
let _ = x_param.suggest(&mut trial1);
study.complete_trial(trial1, 0.3);
let mut trial2 = study.create_trial();
let _ = x_param.suggest(&mut trial2);
study.complete_trial(trial2, 0.8);
let best = study.best_value().unwrap();
assert_eq!(best, 0.8); // Maximize: higher is better
pub fn top_trials(&self, n: usize) -> Vec<CompletedTrial<V>>
where
    V: Clone,
Return the top n trials sorted by objective value.
For Direction::Minimize, returns trials with the lowest values.
For Direction::Maximize, returns trials with the highest values.
Only includes completed trials (not failed or pruned).
If fewer than n completed trials exist, returns all of them.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let x = FloatParam::new(0.0, 10.0);
for val in [5.0, 1.0, 3.0] {
let mut t = study.create_trial();
let _ = x.suggest(&mut t);
study.complete_trial(t, val);
}
let top2 = study.top_trials(2);
assert_eq!(top2.len(), 2);
assert!(top2[0].value <= top2[1].value);
impl<V> Study<V>
pub fn param_importance(&self) -> Vec<(String, f64)>
Compute parameter importance scores using Spearman rank correlation.
For each parameter, the absolute Spearman correlation between its values and the objective values is computed across all completed trials. Scores are normalized so they sum to 1.0 and sorted in descending order.
Parameters that appear in fewer than 2 trials are omitted.
Returns an empty Vec if the study has fewer than 2 completed trials.
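As a concept check, the per-parameter correlation can be sketched in plain Rust (a simplified illustration, not the crate's implementation; ties are ignored for brevity):

```rust
// Standalone sketch of Spearman rank correlation for one parameter
// (ties ignored for brevity; the crate may handle them differently).
fn ranks(xs: &[f64]) -> Vec<f64> {
    let mut idx: Vec<usize> = (0..xs.len()).collect();
    idx.sort_by(|&a, &b| xs[a].partial_cmp(&xs[b]).unwrap());
    let mut out = vec![0.0; xs.len()];
    for (rank, &i) in idx.iter().enumerate() {
        out[i] = rank as f64;
    }
    out
}

fn spearman(xs: &[f64], ys: &[f64]) -> f64 {
    // Pearson correlation computed on the ranks of both series.
    let (rx, ry) = (ranks(xs), ranks(ys));
    let mean = (xs.len() as f64 - 1.0) / 2.0;
    let (mut num, mut dx, mut dy) = (0.0, 0.0, 0.0);
    for i in 0..xs.len() {
        num += (rx[i] - mean) * (ry[i] - mean);
        dx += (rx[i] - mean).powi(2);
        dy += (ry[i] - mean).powi(2);
    }
    num / (dx.sqrt() * dy.sqrt())
}

fn main() {
    // x and the objective are monotonically related, so |rho| = 1.
    let x = [0.1, 0.4, 0.2, 0.9];
    let y: Vec<f64> = x.iter().map(|v| v * v).collect();
    assert!((spearman(&x, &y).abs() - 1.0).abs() < 1e-12);
}
```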
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let x = FloatParam::new(0.0, 10.0).name("x");
study
.optimize(20, |trial: &mut optimizer::Trial| {
let xv = x.suggest(trial)?;
Ok::<_, optimizer::Error>(xv * xv)
})
.unwrap();
let importance = study.param_importance();
assert_eq!(importance.len(), 1);
assert_eq!(importance[0].0, "x");
pub fn fanova(&self) -> Result<FanovaResult>
Compute parameter importance using fANOVA (functional ANOVA) with default configuration.
Fits a random forest to the trial data and decomposes variance into
per-parameter main effects and pairwise interaction effects. This is
more accurate than correlation-based importance (Self::param_importance)
and can detect non-linear relationships and parameter interactions.
§Errors
Returns crate::Error::NoCompletedTrials if fewer than 2 trials have completed.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let x = FloatParam::new(0.0, 10.0).name("x");
let y = FloatParam::new(0.0, 10.0).name("y");
study
.optimize(30, |trial: &mut optimizer::Trial| {
let xv = x.suggest(trial)?;
let yv = y.suggest(trial)?;
Ok::<_, optimizer::Error>(xv * xv + 0.1 * yv)
})
.unwrap();
let result = study.fanova().unwrap();
assert!(!result.main_effects.is_empty());
pub fn fanova_with_config(&self, config: &FanovaConfig) -> Result<FanovaResult>
Compute parameter importance using fANOVA with custom configuration.
See Self::fanova for details. The FanovaConfig
allows tuning the number of trees, tree depth, and random seed.
§Errors
Returns crate::Error::NoCompletedTrials if fewer than 2 trials have completed.
impl<V> Study<V>
pub fn to_csv(&self, writer: impl Write) -> Result<()>
Write completed trials to a writer in CSV format.
Columns: trial_id, value, state, then one column per unique
parameter label, then one column per unique user-attribute key.
Parameters without labels use a generated name (param_<id>).
Pruned trials have an empty value cell.
§Errors
Returns an I/O error if writing fails.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let x = FloatParam::new(0.0, 10.0).name("x");
let mut trial = study.create_trial();
let _ = x.suggest(&mut trial);
study.complete_trial(trial, 0.42);
let mut buf = Vec::new();
study.to_csv(&mut buf).unwrap();
let csv = String::from_utf8(buf).unwrap();
assert!(csv.contains("trial_id"));
pub fn summary(&self) -> String
Return a human-readable summary of the study.
The summary includes:
- Optimization direction and total trial count
- Breakdown by state (complete, pruned) when applicable
- Best trial value and parameters (if any completed trials exist)
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let x = FloatParam::new(0.0, 10.0).name("x");
let mut trial = study.create_trial();
let _ = x.suggest(&mut trial).unwrap();
study.complete_trial(trial, 0.42);
let summary = study.summary();
assert!(summary.contains("Minimize"));
assert!(summary.contains("0.42"));
impl Study<f64>
pub fn export_html(&self, path: impl AsRef<Path>) -> Result<()>
Generate an HTML report with interactive Plotly.js charts.
The output is a self-contained HTML file that can be opened in any browser.
See generate_html_report for details on the included charts.
§Errors
Returns an I/O error if the file cannot be created or written.
impl<V> Study<V>
where
    V: PartialOrd + Clone,
pub fn iter(&self) -> IntoIter<CompletedTrial<V>>
Return an iterator over all completed trials.
This clones the internal trial list, so it is suitable for analysis and iteration but not for hot paths.
§Examples
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let trial = study.create_trial();
study.complete_trial(trial, 1.0);
for t in study.iter() {
println!("Trial {} → {}", t.id, t.value);
}
impl<V> Study<V>
where
    V: PartialOrd,
pub fn optimize(
    &self,
    n_trials: usize,
    objective: impl Objective<V>,
) -> Result<()>
Run optimization with an objective.
Accepts any Objective implementation, including
plain closures (Fn(&mut Trial) -> Result<V, E>) thanks to the
blanket impl. Struct-based objectives can override
before_trial and
after_trial for early stopping.
Runs up to n_trials evaluations sequentially.
§Errors
Returns Error::NoCompletedTrials if no trials completed successfully.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::sampler::random::RandomSampler;
use optimizer::{Direction, Study};
let sampler = RandomSampler::with_seed(42);
let study: Study<f64> = Study::with_sampler(Direction::Minimize, sampler);
let x_param = FloatParam::new(-10.0, 10.0);
study
.optimize(10, |trial: &mut optimizer::Trial| {
let x = x_param.suggest(trial)?;
Ok::<_, optimizer::Error>(x * x)
})
.unwrap();
assert!(study.n_trials() > 0);
assert!(study.best_value().unwrap() >= 0.0);
impl<V> Study<V>
where
    V: PartialOrd,
pub fn new(direction: Direction) -> Self
Create a new study with the given optimization direction.
Uses the default RandomSampler for parameter sampling.
§Arguments
direction - Whether to minimize or maximize the objective function.
§Examples
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
assert_eq!(study.direction(), Direction::Minimize);
pub fn builder() -> StudyBuilder<V>
Return a StudyBuilder for constructing a study with a fluent API.
§Examples
use optimizer::prelude::*;
let study: Study<f64> = Study::builder()
.minimize()
.sampler(TpeSampler::new())
.pruner(NopPruner)
.build();
pub fn minimize(sampler: impl Sampler + 'static) -> Self
Create a study that minimizes the objective value.
This is a shorthand for Study::with_sampler(Direction::Minimize, sampler).
§Arguments
sampler - The sampler to use for parameter sampling.
§Examples
use optimizer::Study;
use optimizer::sampler::tpe::TpeSampler;
let study: Study<f64> = Study::minimize(TpeSampler::new());
assert_eq!(study.direction(), optimizer::Direction::Minimize);
pub fn maximize(sampler: impl Sampler + 'static) -> Self
Create a study that maximizes the objective value.
This is a shorthand for Study::with_sampler(Direction::Maximize, sampler).
§Arguments
sampler - The sampler to use for parameter sampling.
§Examples
use optimizer::Study;
use optimizer::sampler::tpe::TpeSampler;
let study: Study<f64> = Study::maximize(TpeSampler::new());
assert_eq!(study.direction(), optimizer::Direction::Maximize);
pub fn with_sampler(
    direction: Direction,
    sampler: impl Sampler + 'static,
) -> Self
Create a new study with a custom sampler.
§Arguments
direction - Whether to minimize or maximize the objective function.
sampler - The sampler to use for parameter sampling.
§Examples
use optimizer::sampler::random::RandomSampler;
use optimizer::{Direction, Study};
let sampler = RandomSampler::with_seed(42);
let study: Study<f64> = Study::with_sampler(Direction::Maximize, sampler);
assert_eq!(study.direction(), Direction::Maximize);
pub fn with_sampler_and_storage(
    direction: Direction,
    sampler: impl Sampler + 'static,
    storage: impl Storage<V> + 'static,
) -> Self
where
    V: 'static,
Create a study with a custom sampler and storage backend.
Most other constructors delegate to this one. Use it when you need a
non-default storage backend (e.g., JournalStorage).
§Arguments
direction - Whether to minimize or maximize the objective function.
sampler - The sampler to use for parameter sampling.
storage - The storage backend for completed trials.
§Examples
use optimizer::sampler::random::RandomSampler;
use optimizer::storage::MemoryStorage;
use optimizer::{Direction, Study};
let storage = MemoryStorage::<f64>::new();
let study = Study::with_sampler_and_storage(Direction::Minimize, RandomSampler::new(), storage);
pub fn with_sampler_and_pruner(
    direction: Direction,
    sampler: impl Sampler + 'static,
    pruner: impl Pruner + 'static,
) -> Self
Create a study with a custom sampler and pruner.
Uses the default MemoryStorage backend.
§Arguments
direction - Whether to minimize or maximize the objective function.
sampler - The sampler to use for parameter sampling.
pruner - The pruner to use for trial pruning.
§Examples
use optimizer::pruner::NopPruner;
use optimizer::sampler::random::RandomSampler;
use optimizer::{Direction, Study};
let sampler = RandomSampler::with_seed(42);
let study: Study<f64> = Study::with_sampler_and_pruner(Direction::Minimize, sampler, NopPruner);
pub fn set_sampler(&mut self, sampler: impl Sampler + 'static)
where
    V: 'static,
Replace the sampler used for future parameter suggestions.
The new sampler takes effect for all subsequent calls to
create_trial, ask, and the
optimize* family. Already-completed trials are unaffected.
§Examples
use optimizer::sampler::tpe::TpeSampler;
use optimizer::{Direction, Study};
let mut study: Study<f64> = Study::new(Direction::Minimize);
study.set_sampler(TpeSampler::new());
pub fn set_pruner(&mut self, pruner: impl Pruner + 'static)
where
    V: 'static,
Replace the pruner used for future trials.
The new pruner takes effect for all trials created after this call.
§Examples
use optimizer::prelude::*;
let mut study: Study<f64> = Study::new(Direction::Minimize);
study.set_pruner(MedianPruner::new(Direction::Minimize));
pub fn enqueue(&self, params: HashMap<ParamId, ParamValue>)
Enqueue a specific parameter configuration to be evaluated next.
The next call to ask() or the next trial in optimize()
will use these exact parameters instead of sampling from the sampler.
Multiple configurations can be enqueued; they are evaluated in FIFO order.
If an enqueued configuration is missing a parameter that the objective calls
suggest() on, that parameter falls back to normal sampling.
§Arguments
params - A map from parameter IDs to the values to use.
§Examples
use std::collections::HashMap;
use optimizer::parameter::{FloatParam, IntParam, ParamValue, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let x = FloatParam::new(0.0, 10.0);
let y = IntParam::new(1, 100);
// Evaluate these specific configurations first
study.enqueue(HashMap::from([
(x.id(), ParamValue::Float(0.001)),
(y.id(), ParamValue::Int(3)),
]));
// Next trial will use x=0.001, y=3
let mut trial = study.ask();
assert_eq!(x.suggest(&mut trial).unwrap(), 0.001);
assert_eq!(y.suggest(&mut trial).unwrap(), 3);
pub fn n_enqueued(&self) -> usize
Return the number of enqueued parameter configurations.
See enqueue for how to add configurations.
pub fn create_trial(&self) -> Trial
Create a new trial with a unique ID.
The trial starts in the Running state and can be used to suggest
parameter values. After the objective function is evaluated, call
complete_trial or fail_trial to record the result.
For Study<f64>, this method automatically integrates with the study’s
sampler and trial history, so there is no need to call a separate
create_trial_with_sampler() method.
§Examples
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let trial = study.create_trial();
assert_eq!(trial.id(), 0);
let trial2 = study.create_trial();
assert_eq!(trial2.id(), 1);
pub fn complete_trial(&self, trial: Trial, value: V)
Record a completed trial with its objective value.
This method stores the trial’s parameters, distributions, and objective value in the study’s history. The stored data is used by samplers to inform future parameter suggestions.
§Arguments
trial - The trial that was evaluated.
value - The objective value returned by the objective function.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let x_param = FloatParam::new(0.0, 1.0);
let mut trial = study.create_trial();
let x = x_param.suggest(&mut trial).unwrap();
let objective_value = x * x;
study.complete_trial(trial, objective_value);
assert_eq!(study.n_trials(), 1);
pub fn fail_trial(&self, trial: Trial, _error: impl ToString)
Record a failed trial with an error message.
Failed trials are not stored in the study’s history and do not contribute to future sampling decisions. This method is useful when the objective function returns an error that should not stop the optimization process.
§Arguments
trial - The trial that failed.
_error - An error message describing why the trial failed.
§Examples
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let trial = study.create_trial();
study.fail_trial(trial, "objective function raised an exception");
// Failed trials are not counted
assert_eq!(study.n_trials(), 0);
pub fn ask(&self) -> Trial
Request a new trial with suggested parameters.
This is the first half of the ask-and-tell interface. After calling
ask(), use parameter types to suggest values on the returned trial,
evaluate your objective externally, then pass the trial back to
tell() with the result.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let x = FloatParam::new(0.0, 10.0);
let mut trial = study.ask();
let x_val = x.suggest(&mut trial).unwrap();
let value = x_val * x_val;
study.tell(trial, Ok::<_, &str>(value));
pub fn tell(&self, trial: Trial, value: Result<V, impl ToString>)
Report the result of a trial obtained from ask().
Pass Ok(value) for a successful evaluation or Err(reason) for a
failure. Failed trials are not stored in the study’s history.
§Examples
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let trial = study.ask();
study.tell(trial, Ok::<_, &str>(42.0));
assert_eq!(study.n_trials(), 1);
let trial = study.ask();
study.tell(trial, Err::<f64, _>("evaluation failed"));
assert_eq!(study.n_trials(), 1); // failed trials not counted
pub fn prune_trial(&self, trial: Trial)
where
    V: Default,
Record a pruned trial, preserving its intermediate values.
Pruned trials are stored alongside completed trials so that samplers
can optionally learn from partial evaluations. The trial’s state is
set to Pruned.
In practice you rarely call this directly — returning
Err(TrialPruned) from an objective function handles pruning
automatically.
§Arguments
trial - The trial that was pruned.
pub fn trials(&self) -> Vec<CompletedTrial<V>>
where
    V: Clone,
Return all completed trials as a Vec.
The returned vector contains clones of CompletedTrial values, which contain
the trial’s parameters, distributions, and objective value.
Note: This method acquires a read lock on the completed trials and returns a clone of the internal storage.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
let x_param = FloatParam::new(0.0, 1.0);
let mut trial = study.create_trial();
let _ = x_param.suggest(&mut trial);
study.complete_trial(trial, 0.5);
for completed in study.trials() {
println!("Trial {} has value {:?}", completed.id, completed.value);
}
pub fn n_trials(&self) -> usize
Return the number of completed trials.
Pruned and failed trials are not counted. Use
n_pruned_trials() for the pruned count.
§Examples
use optimizer::parameter::{FloatParam, Parameter};
use optimizer::{Direction, Study};
let study: Study<f64> = Study::new(Direction::Minimize);
assert_eq!(study.n_trials(), 0);
let x_param = FloatParam::new(0.0, 1.0);
let mut trial = study.create_trial();
let _ = x_param.suggest(&mut trial);
study.complete_trial(trial, 0.5);
assert_eq!(study.n_trials(), 1);
pub fn n_pruned_trials(&self) -> usize
Return the number of pruned trials.
Pruned trials are those that were stopped early by the pruner.
impl<V: PartialOrd + Send + Sync + 'static> Study<V>
pub fn with_sampler_pruner_and_storage(
    direction: Direction,
    sampler: impl Sampler + 'static,
    pruner: impl Pruner + 'static,
    storage: impl Storage<V> + 'static,
) -> Self
Create a study with a custom sampler, pruner, and storage backend.
The most flexible constructor, allowing full control over all components.
§Arguments
direction - Whether to minimize or maximize the objective function.
sampler - The sampler to use for parameter sampling.
pruner - The pruner to use for trial pruning.
storage - The storage backend for completed trials.
§Examples
use optimizer::prelude::*;
use optimizer::storage::MemoryStorage;
let study = Study::with_sampler_pruner_and_storage(
Direction::Minimize,
TpeSampler::new(),
MedianPruner::new(Direction::Minimize),
MemoryStorage::<f64>::new(),
);