pub trait LeafModel: Send + Sync {
// Required methods
fn predict(&self, features: &[f64]) -> f64;
fn update(
    &mut self,
    features: &[f64],
    gradient: f64,
    hessian: f64,
    lambda: f64,
);
fn clone_fresh(&self) -> Box<dyn LeafModel>;
// Provided method
fn clone_warm(&self) -> Box<dyn LeafModel> { ... }
}
Available on crate feature alloc only.
A trainable prediction model that lives inside a decision tree leaf.
Implementations must be Send + Sync so trees can be shared across threads.
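The sketch below shows one way the trait could be implemented: a hypothetical `LinearLeaf` (not part of this crate) holding one weight per feature plus a bias, updated with a Newton-style step `-g / (h + lambda)`. The trait definition is reproduced from above so the example is self-contained; the update rule is an illustrative assumption, not the crate's prescribed training scheme.

```rust
// Trait definition reproduced for a self-contained example.
pub trait LeafModel: Send + Sync {
    fn predict(&self, features: &[f64]) -> f64;
    fn update(&mut self, features: &[f64], gradient: f64, hessian: f64, lambda: f64);
    fn clone_fresh(&self) -> Box<dyn LeafModel>;
    fn clone_warm(&self) -> Box<dyn LeafModel> {
        self.clone_fresh()
    }
}

// Hypothetical leaf model: linear in the features, zero-initialized.
#[derive(Clone)]
struct LinearLeaf {
    weights: Vec<f64>,
    bias: f64,
}

impl LinearLeaf {
    fn new(n_features: usize) -> Self {
        LinearLeaf { weights: vec![0.0; n_features], bias: 0.0 }
    }
}

impl LeafModel for LinearLeaf {
    fn predict(&self, features: &[f64]) -> f64 {
        self.bias + self.weights.iter().zip(features).map(|(w, x)| w * x).sum::<f64>()
    }

    fn update(&mut self, features: &[f64], gradient: f64, hessian: f64, lambda: f64) {
        // Newton-style step, regularized by lambda (an illustrative choice).
        let step = -gradient / (hessian + lambda);
        self.bias += step;
        for (w, x) in self.weights.iter_mut().zip(features) {
            *w += step * x;
        }
    }

    fn clone_fresh(&self) -> Box<dyn LeafModel> {
        // Same architecture (feature count), parameters re-zeroed.
        Box::new(LinearLeaf::new(self.weights.len()))
    }

    fn clone_warm(&self) -> Box<dyn LeafModel> {
        // Keep learned weights; this toy model has no optimizer state to reset.
        Box::new(self.clone())
    }
}

fn main() {
    let mut leaf = LinearLeaf::new(2);
    leaf.update(&[1.0, 0.5], -2.0, 1.0, 1.0); // step = -(-2) / (1 + 1) = 1.0
    let warm = leaf.clone_warm();   // predicts like the trained leaf
    let fresh = leaf.clone_fresh(); // predicts 0.0 again
    println!("{} {} {}",
        leaf.predict(&[1.0, 0.5]),
        warm.predict(&[1.0, 0.5]),
        fresh.predict(&[1.0, 0.5]));
}
```

Note that `clone_fresh` returns the same architecture with parameters re-initialized, while `clone_warm` carries the learned function over; both return `Box<dyn LeafModel>` so trees can hold heterogeneous leaf models behind one trait object.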
Required Methods

fn predict(&self, features: &[f64]) -> f64
Predict an output value for the given feature vector.

fn update(&mut self, features: &[f64], gradient: f64, hessian: f64, lambda: f64)
Update the model's parameters given a gradient, a hessian, and the regularization strength lambda.

fn clone_fresh(&self) -> Box<dyn LeafModel>
Create a fresh (zeroed / re-initialized) clone of this model's architecture.
Provided Methods

fn clone_warm(&self) -> Box<dyn LeafModel>
Create a warm clone preserving learned weights but resetting optimizer state.
Used when splitting a leaf: child leaves inherit the parent’s learned
function as a starting point, converging faster than starting from scratch.
The default implementation simply calls clone_fresh, which suits models where
warm-starting is not meaningful (e.g. ClosedFormLeaf).
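The default fallback can be sketched as follows, using a hypothetical constant-valued `ConstLeaf` (not part of this crate) that does not override clone_warm, so a split hands its children freshly re-initialized models rather than warm copies:

```rust
// Minimal stub of the trait, showing the default clone_warm fallback.
pub trait LeafModel: Send + Sync {
    fn predict(&self, features: &[f64]) -> f64;
    fn clone_fresh(&self) -> Box<dyn LeafModel>;
    fn clone_warm(&self) -> Box<dyn LeafModel> {
        // Default: defer to clone_fresh when warm-starting is not meaningful.
        self.clone_fresh()
    }
}

// Hypothetical leaf that predicts a single constant.
struct ConstLeaf(f64);

impl LeafModel for ConstLeaf {
    fn predict(&self, _features: &[f64]) -> f64 { self.0 }
    fn clone_fresh(&self) -> Box<dyn LeafModel> { Box::new(ConstLeaf(0.0)) }
    // No clone_warm override: the default (fresh clone) is used.
}

// Illustrative split: both children start from the parent's warm clone.
fn split(parent: &dyn LeafModel) -> (Box<dyn LeafModel>, Box<dyn LeafModel>) {
    (parent.clone_warm(), parent.clone_warm())
}

fn main() {
    let parent = ConstLeaf(3.0);
    let (left, right) = split(&parent);
    // ConstLeaf inherits the default clone_warm, so both children predict 0.0.
    println!("{} {}", left.predict(&[]), right.predict(&[]));
}
```

A model that does benefit from warm-starting would override clone_warm to copy its learned parameters while zeroing any optimizer state, as described above.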