// ferrolearn_tree/lib.rs
//! # ferrolearn-tree
//!
//! Decision tree and ensemble tree models for the ferrolearn machine learning framework.
//!
//! This crate provides implementations of:
//!
//! - **[`DecisionTreeClassifier`]** / **[`DecisionTreeRegressor`]** — CART decision trees
//!   with configurable splitting criteria, depth limits, and minimum sample constraints.
//! - **[`RandomForestClassifier`]** / **[`RandomForestRegressor`]** — Bootstrap-aggregated
//!   ensembles of decision trees with random feature subsets, built in parallel via `rayon`.
//! - **[`GradientBoostingClassifier`]** / **[`GradientBoostingRegressor`]** — Gradient boosting
//!   ensembles that sequentially fit trees to the negative gradient of a loss function.
//! - **[`HistGradientBoostingClassifier`]** / **[`HistGradientBoostingRegressor`]** —
//!   Histogram-based gradient boosting with `O(n_bins)` split finding, the histogram
//!   subtraction trick, native NaN support, and optional best-first (leaf-wise) growth.
//! - **[`AdaBoostClassifier`]** — Adaptive Boosting over decision tree stumps, supporting
//!   both the SAMME and SAMME.R algorithms.
//! - **[`ExtraTreeClassifier`]** / **[`ExtraTreeRegressor`]** — Extremely randomized
//!   trees where split thresholds are chosen randomly rather than via exhaustive search.
//! - **[`ExtraTreesClassifier`]** / **[`ExtraTreesRegressor`]** — Ensembles of
//!   extremely randomized trees with Rayon-parallel fitting. No bootstrap by default.
//! - **[`IsolationForest`]** — Anomaly detection via random isolation trees.
//! - **[`VotingClassifier`]** / **[`VotingRegressor`]** — Ensembles of decision trees
//!   with varying hyperparameters, aggregated by majority vote or averaging.
//! - **[`RandomTreesEmbedding`]** — Unsupervised feature transformation via one-hot
//!   encoded leaf indices across an ensemble of randomly built trees.
//!
//! # Design
//!
//! Each model follows the compile-time safety pattern:
//!
//! - The unfitted struct (e.g., `DecisionTreeClassifier<F>`) holds hyperparameters
//!   and implements [`Fit`](ferrolearn_core::Fit).
//! - Calling `fit()` produces a new fitted type (e.g., `FittedDecisionTreeClassifier<F>`)
//!   that implements [`Predict`](ferrolearn_core::Predict).
//! - Calling `predict()` on an unfitted model is a compile-time error.
//!
//! # Pipeline Integration
//!
//! All models implement [`PipelineEstimator`](ferrolearn_core::pipeline::PipelineEstimator)
//! for `f64`, allowing them to be used as the final step in a
//! [`Pipeline`](ferrolearn_core::pipeline::Pipeline).
//!
//! # Float Generics
//!
//! All models are generic over `F: num_traits::Float + Send + Sync + 'static`,
//! supporting both `f32` and `f64`.

pub mod adaboost;
pub mod decision_tree;
pub mod extra_tree;
pub mod extra_trees_ensemble;
pub mod gradient_boosting;
pub mod hist_gradient_boosting;
pub mod isolation_forest;
pub mod random_forest;
pub mod random_trees_embedding;
pub mod voting;

// Re-export the main types at the crate root.
pub use adaboost::{AdaBoostAlgorithm, AdaBoostClassifier, FittedAdaBoostClassifier};
pub use decision_tree::{
    ClassificationCriterion, DecisionTreeClassifier, DecisionTreeRegressor,
    FittedDecisionTreeClassifier, FittedDecisionTreeRegressor, Node, RegressionCriterion,
};
pub use extra_tree::{
    ExtraTreeClassifier, ExtraTreeRegressor, FittedExtraTreeClassifier, FittedExtraTreeRegressor,
};
pub use extra_trees_ensemble::{
    ExtraTreesClassifier, ExtraTreesRegressor, FittedExtraTreesClassifier,
    FittedExtraTreesRegressor,
};
pub use gradient_boosting::{
    ClassificationLoss, FittedGradientBoostingClassifier, FittedGradientBoostingRegressor,
    GradientBoostingClassifier, GradientBoostingRegressor, RegressionLoss,
};
pub use hist_gradient_boosting::{
    FittedHistGradientBoostingClassifier, FittedHistGradientBoostingRegressor,
    HistClassificationLoss, HistGradientBoostingClassifier, HistGradientBoostingRegressor,
    HistNode, HistRegressionLoss,
};
pub use isolation_forest::{FittedIsolationForest, IsolationForest};
pub use random_forest::{
    FittedRandomForestClassifier, FittedRandomForestRegressor, MaxFeatures, RandomForestClassifier,
    RandomForestRegressor,
};
pub use random_trees_embedding::{FittedRandomTreesEmbedding, RandomTreesEmbedding};
pub use voting::{
    FittedVotingClassifier, FittedVotingRegressor, VotingClassifier, VotingRegressor,
};