
Module ensemble

Available on crate feature alloc only.

SGBT ensemble orchestrator – the core boosting loop.

Implements Streaming Gradient Boosted Trees (Gunasekara et al., 2024): a sequence of boosting steps, each owning a streaming tree and a drift detector, with automatic tree replacement when concept drift is detected.

§Algorithm

For each incoming sample (x, y):

  1. Emit the current ensemble prediction: F(x) = base + lr * Σ tree_s(x)
  2. For each boosting step s = 1..N, with current_pred initialized to base:
    • Compute the gradient g = loss.gradient(y, current_pred)
    • Compute the hessian h = loss.hessian(y, current_pred)
    • Feed (x, g, h) to tree s (which internally fits a weighted squared loss)
    • Update current_pred += lr * tree_s.predict(x)
  3. Each tree thus targets the residual of all preceding trees, so the ensemble adapts incrementally.
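The per-sample loop above can be sketched as follows. This is a minimal illustration, not the crate's API: `StubTree` is a hypothetical stand-in that keeps a single Newton leaf (-Σg / Σh) instead of growing splits online, and squared-error loss (g = pred - y, h = 1) is assumed.

```rust
// Hypothetical stand-in for a streaming tree: one Newton leaf.
#[derive(Default)]
struct StubTree {
    grad_sum: f64,
    hess_sum: f64,
}

impl StubTree {
    fn predict(&self, _x: &[f64]) -> f64 {
        if self.hess_sum > 0.0 {
            -self.grad_sum / self.hess_sum
        } else {
            0.0
        }
    }

    // Receive (x, g, h); a real streaming tree fits a weighted squared loss.
    fn learn(&mut self, _x: &[f64], g: f64, h: f64) {
        self.grad_sum += g;
        self.hess_sum += h;
    }
}

/// One SGBT update for a single sample (x, y), returning the prediction.
fn sgbt_learn_one(trees: &mut [StubTree], base: f64, lr: f64, x: &[f64], y: f64) -> f64 {
    let mut pred = base;
    for tree in trees.iter_mut() {
        // Gradient/hessian of squared error at the running prediction,
        // so each tree targets the residual of all preceding trees.
        let g = pred - y;
        let h = 1.0;
        tree.learn(x, g, h);
        pred += lr * tree.predict(x);
    }
    pred
}

fn main() {
    let mut trees: Vec<StubTree> = (0..50).map(|_| StubTree::default()).collect();
    // With N steps the prediction approaches y geometrically:
    // pred = y * (1 - (1 - lr)^N).
    let pred = sgbt_learn_one(&mut trees, 0.0, 0.1, &[1.0, 2.0], 3.0);
    assert!((pred - 3.0).abs() < 0.05);
    println!("{pred:.4}");
}
```

Note the shrinkage interplay: because each step's gradient is taken at the running prediction of the preceding steps, the learning rate controls how much residual is left for later trees.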

Re-exports§

pub use core::SGBT;

Modules§

adaptive
Adaptive learning rate wrapper for SGBT ensembles.
adaptive_forest
Adaptive Random Forest (ARF) for streaming classification.
bagged
Bagged SGBT ensemble using Oza online bagging (Poisson weighting).
config
SGBT configuration with builder pattern and full validation.
core
SGBT core: struct definition, Clone/Debug, and constructors.
diagnostics
Diagnostics for SGBT ensembles.
distributional
Distributional SGBT – outputs Gaussian N(μ, σ²) instead of a point estimate.
lr_schedule
Learning rate scheduling for streaming gradient boosted trees.
moe
Streaming Mixture of Experts over SGBT ensembles.
moe_distributional
Streaming Mixture of Experts over Distributional SGBT ensembles with shadow expert competition.
multi_target
Multi-target regression via parallel SGBT ensembles.
multiclass
Multi-class classification via one-vs-rest SGBT committees.
parallel
Parallel SGBT training with delayed gradient updates.
quantile_regressor
Non-crossing multi-quantile regression via parallel SGBT ensembles.
replacement
TreeSlot: warning/danger/swap lifecycle for tree replacement.
stacked
Polymorphic model stacking meta-learner for streaming ensembles.
step
Single boosting step: owns one tree + drift detector + optional alternate.
variants
SGBT computational variants (Gunasekara et al., 2024).

Type Aliases§

DynSGBT
Type alias for an SGBT model using dynamic (boxed) loss dispatch.