Module ensemble

SGBT ensemble orchestrator — the core boosting loop.

Implements Streaming Gradient Boosted Trees (Gunasekara et al., 2024): a sequence of boosting steps, each owning a streaming tree and drift detector, with automatic tree replacement when concept drift is detected.

§Algorithm

For each incoming sample (x, y):

  1. Initialize the running prediction: current_pred = base (the full ensemble prediction is F(x) = base + lr * Σ tree_s(x))
  2. For each boosting step s = 1..N:
    • Compute the gradient g = loss.gradient(y, current_pred)
    • Compute the hessian h = loss.hessian(y, current_pred)
    • Feed (x, g, h) to tree s (which internally fits a hessian-weighted squared loss)
    • Update current_pred += lr * tree_s.predict(x)
  3. Each tree thus targets the residual of all preceding trees, so the ensemble adapts incrementally.

Modules§

bagged
Bagged SGBT ensemble using Oza online bagging (Poisson weighting).
config
SGBT configuration with builder pattern and full validation.
multi_target
Multi-target regression via parallel SGBT ensembles.
multiclass
Multi-class classification via one-vs-rest SGBT committees.
parallel
Parallel SGBT training with delayed gradient updates.
quantile_regressor
Non-crossing multi-quantile regression via parallel SGBT ensembles.
replacement
TreeSlot: warning/danger/swap lifecycle for tree replacement.
step
Single boosting step: owns one tree + drift detector + optional alternate.
variants
SGBT computational variants (Gunasekara et al., 2024).

Structs§

SGBT
Streaming Gradient Boosted Trees ensemble.

Type Aliases§

DynSGBT
Type alias for an SGBT model using dynamic (boxed) loss dispatch.