Advanced ML models: Random Forest, Gradient Boosting, and cross-validation.
Pure-Rust implementations for molecular property prediction beyond simple linear models. Includes:
- Random Forest (bagged decision trees)
- Gradient Boosted Trees (GBM / GBRT)
- K-fold cross-validation
- Model recalibration via isotonic regression
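As a rough sketch of the recalibration step above: isotonic regression is typically fit with the pool-adjacent-violators algorithm (PAVA), which pools neighboring points whenever the monotone non-decreasing constraint is violated. The function below is illustrative only — `isotonic_fit` is not this crate's API, and the crate's `IsotonicCalibrator` may differ in details (weights, interpolation at predict time).

```rust
// Minimal PAVA sketch for an unweighted, non-decreasing isotonic fit.
// `isotonic_fit` is a hypothetical name, not part of the crate's API.
fn isotonic_fit(ys: &[f64]) -> Vec<f64> {
    // Each block stores the (mean, weight) of a pooled run of points.
    let mut blocks: Vec<(f64, f64)> = Vec::new();
    for &y in ys {
        blocks.push((y, 1.0));
        // Pool adjacent blocks while the monotone constraint is violated.
        while blocks.len() >= 2 {
            let n = blocks.len();
            let (m1, w1) = blocks[n - 2];
            let (m2, w2) = blocks[n - 1];
            if m1 <= m2 {
                break;
            }
            let w = w1 + w2;
            blocks[n - 2] = ((m1 * w1 + m2 * w2) / w, w);
            blocks.pop();
        }
    }
    // Expand pooled blocks back to one fitted value per input point.
    let mut fitted = Vec::with_capacity(ys.len());
    for &(mean, w) in &blocks {
        for _ in 0..w as usize {
            fitted.push(mean);
        }
    }
    fitted
}

fn main() {
    // The violating prefix [3, 1, 2] pools to a flat level of 2.
    let fit = isotonic_fit(&[3.0, 1.0, 2.0, 4.0]);
    println!("{:?}", fit); // [2.0, 2.0, 2.0, 4.0]
    assert!(fit.windows(2).all(|w| w[0] <= w[1]));
}
```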
Structs§
- `CrossValidationResult` - Result of k-fold cross-validation.
- `GradientBoosting` - Gradient Boosted Trees regressor.
- `GradientBoostingConfig` - Configuration for Gradient Boosting.
- `IsotonicCalibrator` - Isotonic regression for model recalibration. Fits a monotone non-decreasing function to prediction-target pairs.
- `RandomForest` - Random Forest model: ensemble of bagged decision trees.
- `RandomForestConfig` - Configuration for Random Forest.
- `TreeConfig` - Configuration for tree building.
Functions§
- `build_tree` - Build a decision tree from data.
- `cross_validate` - Perform k-fold cross-validation.
- `train_gradient_boosting` - Train a Gradient Boosting regressor (L2 loss / least squares).
- `train_random_forest` - Train a Random Forest regressor.