```rust
pub struct Optimizer<KInit, P, M> {
    pub cutoff: f64,
    pub n_candidates: usize,
    pub bandwidth: P,
    /* private fields */
}
```
✨ Hyperparameter optimizer.
§Generic parameters

- `KInit`: kernel type of the initial (prior) estimator component
- `P`: type of the parameter being optimized
- `M`: type of the target function value; the lower, the better
§Fields

- `cutoff: f64`
- `n_candidates: usize`
- `bandwidth: P`

§Implementations
```rust
impl<KInit, P, M> Optimizer<KInit, P, M>
```
```rust
pub fn new(range: RangeInclusive<P>, init_kernel: KInit, rng: Rng) -> Self
where
    P: One,
```
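A minimal construction sketch. Only `Optimizer::new` itself comes from this page; the crate path, the `Uniform` prior kernel, and the `Rng` constructor are illustrative assumptions, so substitute the crate's actual types:

```rust
use std::ops::RangeInclusive;

// Hypothetical paths; replace with this crate's actual kernel and RNG types.
use optimizer_crate::{kernel::Uniform, Optimizer, Rng};

// Search the parameter space -10.0..=10.0 with a uniform prior kernel
// and a fresh random number generator:
let range: RangeInclusive<f64> = -10.0..=10.0;
let mut optimizer: Optimizer<Uniform, f64, f64> =
    Optimizer::new(range, Uniform, Rng::new());
```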
```rust
pub fn n_candidates(self, n_candidates: impl Into<usize>) -> Self
```
Set the number of candidates from which the next trial is chosen via the acquisition function¹.

Sampling from the acquisition function is cheaper than evaluating the target cost function, so more candidates make the optimization step more effective. However, the acquisition function is only an approximation of the potential gain, so fewer candidates make the optimization step more precise. The number of candidates is therefore a tradeoff.

¹ The acquisition function is basically the ratio between the «good» KDE and the «bad» KDE at the same point. ↩
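Continuing the hedged sketch above, the builder call itself is straightforward; the value 25 is arbitrary:

```rust
// More candidates exploit the cheap acquisition function harder;
// fewer candidates lean less on its approximation error.
let optimizer = optimizer.n_candidates(25_usize);
```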
```rust
pub fn bandwidth(self, bandwidth: impl Into<P>) -> Self
```
Set the bandwidth multiplier for the estimator kernels.

The standard deviation of each kernel is the distance from its point to the furthest neighbour, multiplied by this coefficient. The default multiplier is `P::one()`. A lower bandwidth approximates the density better but is also prone to over-fitting. A higher bandwidth avoids over-fitting better but is smoother and less precise.
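The documented rule is easy to state numerically; the figures below are made up purely to illustrate the scaling:

```rust
// sigma = distance to the furthest neighbour * bandwidth multiplier
let distance_to_furthest_neighbour = 0.4_f64;
let multiplier = 1.0_f64; // the default, P::one()
let sigma = distance_to_furthest_neighbour * multiplier;
assert!((sigma - 0.4).abs() < f64::EPSILON);

// A larger multiplier smooths the estimate at the cost of precision:
let optimizer = optimizer.bandwidth(2.0_f64);
```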
```rust
pub fn feed_back(&mut self, parameter: P, metric: M)
```
Provide information about a completed trial; in other words, «fit» the optimizer on the sample.

Normally, you’ll call your target function on the parameter supplied by Optimizer::new_trial and feed the result back, but you can also feed in any arbitrary parameters.
§Parameters
- `parameter`: the target function parameter
- `metric`: the target function metric
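One use of arbitrary feedback is warm-starting the optimizer from observations gathered elsewhere; the parameter and metric values below are made up:

```rust
// Seed the optimizer with pre-existing (parameter, metric) pairs,
// where a lower metric is better:
optimizer.feed_back(0.25, 1.7);
optimizer.feed_back(3.50, 0.9);
```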
```rust
pub fn new_trial<K>(&mut self) -> P
```
Generate a parameter value for a new trial.
After evaluating the target function with this parameter, feed the resulting metric back via Optimizer::feed_back.
§Type parameters
- `K`: kernel type
- `D`: kernel density type
§Panics
This method may panic if a random or calculated number cannot be converted to the parameter or density type.
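Putting it together, a typical ask-and-tell loop looks like this. The `Gaussian` trial kernel and the toy objective are assumptions; only `new_trial`, `feed_back`, and `best_trial` come from this page:

```rust
// Hypothetical kernel import; use the crate's actual kernel types.
use optimizer_crate::kernel::Gaussian;

// Toy target function with its minimum at x = 1.0:
fn objective(x: f64) -> f64 {
    (x - 1.0).powi(2)
}

for _ in 0..100 {
    // Ask for the next parameter to try (kernel type assumed):
    let x = optimizer.new_trial::<Gaussian>();
    // Evaluate the target and report the metric back:
    optimizer.feed_back(x, objective(x));
}

// Inspect the best observation so far, if any:
if let Some(_best) = optimizer.best_trial() {
    // `Trial`'s fields are not shown on this page; adapt as needed.
}
```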
```rust
pub fn best_trial(&self) -> Option<&Trial<P, M>>
```
Get the best trial observed so far, if any.