# optimizer
Bayesian and population-based optimization library with an Optuna-like API for hyperparameter tuning and black-box optimization. Supports 12 samplers, 8 pruners, multi-objective optimization, async parallelism, and persistent storage.
## Quick Start
```rust
use optimizer::*;

// Minimize (x - 2)^2 over x in [-10, 10]; identifier names here are
// reconstructed from the crate's Optuna-like API and may differ slightly.
let study: Study<f64> = Study::new();
let x = FloatParam::new(-10.0, 10.0).name("x");
study.optimize(|trial| Ok((trial.suggest(&x)? - 2.0).powi(2)), 100).unwrap();
let best = study.best_trial().unwrap();
println!("best value: {} at params {:?}", best.value, best.params);
```
## Features at a Glance
- Samplers — Random, TPE, Multivariate TPE, Grid, Sobol, CMA-ES, Gaussian Process, Differential Evolution, BOHB, NSGA-II, NSGA-III, MOEA/D
- Pruners — Median, Percentile, Threshold, Patient, Hyperband, Successive Halving, Wilcoxon, Nop (sampler and pruner setup is sketched after this list)
- Parameters — Float, Int, Categorical, Bool, and Enum types with `.name()` labels and typed access (see the parameter sketch below)
- Multi-objective — Pareto front extraction with NSGA-II/III and MOEA/D
- Async & parallel — Concurrent trial evaluation with Tokio
- Storage backends — In-memory (default) or JSONL journal for persistence and resumption (see the storage sketch below)
- Visualization — HTML reports with optimization history and parameter importance
- Analysis — fANOVA and Spearman correlation for parameter importance
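A sampler and pruner are typically supplied when the study is created. The sketch below is illustrative only: the builder-style constructors (`Study::builder`, `TpeSampler`, `MedianPruner`) and the Optuna-style trial methods (`report`, `should_prune`, `Error::TrialPruned`) are assumed names, not confirmed signatures.

```rust
use optimizer::*;

// Illustrative sketch: TPE sampling plus median pruning.
// All names here are assumed from the crate's Optuna-like API.
let study: Study<f64> = Study::builder()
    .sampler(TpeSampler::default())
    .pruner(MedianPruner::default())
    .build();

study.optimize(|trial| {
    let lr = trial.suggest(&FloatParam::new(1e-5, 1e-1).name("lr"))?;
    let mut loss = 1.0;
    for step in 0..20 {
        loss *= 1.0 - lr;         // stand-in for one training step
        trial.report(step, loss); // give the pruner an intermediate value
        if trial.should_prune() {
            return Err(Error::TrialPruned); // abandon an unpromising trial
        }
    }
    Ok(loss)
}, 50).unwrap();
```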
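The parameter types map naturally onto one constructor each. Only the `.name()` labeling is documented above; `FloatParam`, `IntParam`, `CategoricalParam`, and `BoolParam` are assumed names patterned on the listed types.

```rust
use optimizer::*;

// Assumed constructor names, one per parameter type from the feature list.
let lr = FloatParam::new(1e-5, 1e-1).name("learning_rate");
let depth = IntParam::new(1, 8).name("depth");
let act = CategoricalParam::new(["relu", "tanh", "gelu"]).name("activation");
let bias = BoolParam::new().name("use_bias");
// Inside an objective, trial.suggest(&lr) would return a typed f64,
// trial.suggest(&depth) an i64, and so on (typed access).
```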
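For the JSONL journal backend (behind the `journal` feature), resuming a study might look like the following; `JournalStorage::open` and the `.storage(...)` builder hook are assumptions for illustration.

```rust
use optimizer::*;

// Illustrative sketch with the `journal` feature enabled.
// `JournalStorage::open` is an assumed constructor name.
let storage = JournalStorage::open("study.jsonl").unwrap();
let study: Study<f64> = Study::builder()
    .storage(storage)
    .build();
// Trials already recorded in study.jsonl are loaded on open, so an
// interrupted run resumes where it left off; new trials append to the file.
```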
## Feature Flags

| Flag | Enables | Default |
|---|---|---|
| `async` | Async/parallel optimization (Tokio) | No |
| `derive` | `#[derive(Categorical)]` for enum parameters | No |
| `serde` | Serialization of trials and parameters | No |
| `journal` | JSONL storage backend (implies `serde`) | No |
| `sobol` | Sobol quasi-random sampler | No |
| `cma-es` | CMA-ES sampler (requires `nalgebra`) | No |
| `gp` | Gaussian Process sampler (requires `nalgebra`) | No |
| `tracing` | Structured logging with `tracing` | No |
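All flags are off by default and are enabled through Cargo.toml in the usual way. The version below is a placeholder, not the crate's actual release number.

```toml
[dependencies]
# Version is a placeholder; check the crate's current release.
optimizer = { version = "0.1", features = ["async", "journal", "sobol"] }
# "journal" pulls in "serde" automatically, per the table above.
```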
## Examples

## Learn More

## License

MIT