Tree building algorithms and utilities
This module contains algorithms for building decision trees, including split finding, impurity calculations, and feature grouping utilities.
Structs
- BestFirstTreeBuilder - Best-first tree builder
Functions
- apply_auto_correlation_grouping - Apply automatic correlation-based feature grouping (sketch below)
- apply_feature_grouping - Apply feature grouping to training data
- apply_hierarchical_grouping - Apply hierarchical clustering-based feature grouping
- apply_manual_grouping - Apply manual feature grouping specified by the user
- calculate_correlation_matrix - Calculate correlation matrix for features
- calculate_pearson_correlation - Calculate Pearson correlation between two feature vectors (sketch below)
- create_reduced_feature_matrix - Create reduced feature matrix with only representative features
- find_best_logloss_split - Find best split using Log-loss criterion for classification
- find_best_mae_split - Find best split using MAE criterion for regression
- find_best_split_for_node - Find best split for a node given sample indices
- find_best_twoing_split - Find best split using Twoing criterion for classification
- gini_impurity - Calculate Gini impurity for multiway splits (sketch below)
- handle_missing_values - Handle missing values in the data based on the specified strategy
- hierarchical_clustering - Simple hierarchical clustering implementation
- log_loss_impurity - Calculate Log-loss impurity for probability-based classification (sketch below)
- mae_impurity - Calculate Mean Absolute Error (MAE) impurity for regression (sketch below)
- select_group_representative - Select representative feature from a group
- split_samples_by_threshold - Split samples by threshold (sketch below)
- twoing_impurity - Calculate Twoing criterion impurity for binary classification (sketch below)
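The listing above gives only one-line summaries, so the sketches below illustrate the general techniques the function names refer to; they are standalone examples with assumed names, signatures, and data layouts, not this module's actual API. First, a rough sketch of correlation-based feature grouping in the spirit of apply_auto_correlation_grouping: features whose pairwise absolute correlation exceeds a threshold are greedily collected into one group. The greedy strategy and the `Vec<Vec<f64>>` correlation-matrix layout are assumptions.

```rust
/// Illustrative only: greedily group feature indices whose pairwise
/// absolute correlation exceeds `threshold`. `corr` is assumed to be a
/// symmetric correlation matrix (corr[i][j] == corr[j][i]).
fn group_by_correlation(corr: &[Vec<f64>], threshold: f64) -> Vec<Vec<usize>> {
    let n = corr.len();
    let mut assigned = vec![false; n];
    let mut groups = Vec::new();
    for i in 0..n {
        if assigned[i] {
            continue;
        }
        // Start a new group seeded with feature i, then pull in every
        // still-unassigned feature that is strongly correlated with it.
        let mut group = vec![i];
        assigned[i] = true;
        for j in (i + 1)..n {
            if !assigned[j] && corr[i][j].abs() >= threshold {
                group.push(j);
                assigned[j] = true;
            }
        }
        groups.push(group);
    }
    groups
}
```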
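calculate_pearson_correlation presumably computes the textbook Pearson coefficient r = cov(x, y) / (sigma_x * sigma_y). A minimal standalone sketch, with the zero-variance fallback chosen arbitrarily for illustration:

```rust
/// Textbook Pearson correlation between two equal-length samples.
/// Returns 0.0 when either sample has zero variance (an assumption made
/// for this sketch; the real function may handle that case differently).
fn pearson(x: &[f64], y: &[f64]) -> f64 {
    assert_eq!(x.len(), y.len());
    let n = x.len() as f64;
    let mean_x = x.iter().sum::<f64>() / n;
    let mean_y = y.iter().sum::<f64>() / n;
    let (mut cov, mut var_x, mut var_y) = (0.0, 0.0, 0.0);
    for (&xi, &yi) in x.iter().zip(y) {
        cov += (xi - mean_x) * (yi - mean_y);
        var_x += (xi - mean_x).powi(2);
        var_y += (yi - mean_y).powi(2);
    }
    if var_x == 0.0 || var_y == 0.0 {
        0.0
    } else {
        cov / (var_x.sqrt() * var_y.sqrt())
    }
}
```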
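Gini impurity of a node is 1 - sum_k p_k^2, where p_k is the fraction of samples in class k. A generic sketch over raw class labels (the real gini_impurity may instead take class counts or probabilities):

```rust
use std::collections::HashMap;

/// Gini impurity of a set of class labels: 1 - sum_k p_k^2.
/// Sketch only; the module's gini_impurity may take different inputs.
fn gini(labels: &[usize]) -> f64 {
    if labels.is_empty() {
        return 0.0;
    }
    let mut counts: HashMap<usize, usize> = HashMap::new();
    for &label in labels {
        *counts.entry(label).or_insert(0) += 1;
    }
    let n = labels.len() as f64;
    1.0 - counts
        .values()
        .map(|&c| {
            let p = c as f64 / n;
            p * p
        })
        .sum::<f64>()
}
```

For a binary node with class proportions p and 1 - p this reduces to 2p(1 - p), which peaks at a perfectly mixed node (p = 0.5) and is zero for a pure node.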
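Log-loss impurity is the cross-entropy of the node's class proportions, -sum_k p_k ln(p_k). Again a sketch over raw labels, assuming natural logarithms; classes absent from the node contribute nothing:

```rust
use std::collections::HashMap;

/// Log-loss (cross-entropy) impurity: -sum_k p_k ln(p_k).
fn log_loss(labels: &[usize]) -> f64 {
    if labels.is_empty() {
        return 0.0;
    }
    let mut counts: HashMap<usize, usize> = HashMap::new();
    for &label in labels {
        *counts.entry(label).or_insert(0) += 1;
    }
    let n = labels.len() as f64;
    -counts
        .values()
        .map(|&c| {
            let p = c as f64 / n;
            p * p.ln()
        })
        .sum::<f64>()
}
```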
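MAE impurity for regression is commonly defined as the mean absolute deviation of the node's targets from their median; whether this module centres on the median or the mean is not stated here, so the median is an assumption of this sketch:

```rust
/// Mean absolute deviation of targets from the node median (the usual
/// MAE impurity definition; the choice of median is assumed here).
fn mae_impurity(targets: &[f64]) -> f64 {
    if targets.is_empty() {
        return 0.0;
    }
    let mut sorted = targets.to_vec();
    sorted.sort_by(|a, b| a.partial_cmp(b).unwrap());
    let mid = sorted.len() / 2;
    let median = if sorted.len() % 2 == 0 {
        (sorted[mid - 1] + sorted[mid]) / 2.0
    } else {
        sorted[mid]
    };
    targets.iter().map(|t| (t - median).abs()).sum::<f64>() / targets.len() as f64
}
```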
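split_samples_by_threshold suggests the usual binary partition of sample indices on one feature column. A sketch assuming a row-major x[sample][feature] layout and a `<=` convention for the left branch:

```rust
/// Partition sample indices by comparing one feature column against a
/// threshold: `<= threshold` goes left, `> threshold` goes right.
fn split_by_threshold(
    x: &[Vec<f64>],
    samples: &[usize],
    feature: usize,
    threshold: f64,
) -> (Vec<usize>, Vec<usize>) {
    let mut left = Vec::new();
    let mut right = Vec::new();
    for &i in samples {
        if x[i][feature] <= threshold {
            left.push(i);
        } else {
            right.push(i);
        }
    }
    (left, right)
}
```

A split search in the style of find_best_split_for_node would typically evaluate many such candidate partitions and keep the feature/threshold pair that minimises the weighted impurity of the two children.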
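The Twoing criterion scores a candidate binary split as (p_L * p_R / 4) * (sum_k |p(k|L) - p(k|R)|)^2, which is maximised rather than minimised; how twoing_impurity and find_best_twoing_split turn this into an impurity value is not visible from the summaries. A sketch of the raw criterion:

```rust
/// Twoing criterion for a candidate split, computed from the left/right
/// class labels: (p_L * p_R / 4) * (sum_k |p(k|L) - p(k|R)|)^2.
/// Larger values indicate better splits.
fn twoing_value(left: &[usize], right: &[usize], n_classes: usize) -> f64 {
    let (n_l, n_r) = (left.len() as f64, right.len() as f64);
    let n = n_l + n_r;
    if n_l == 0.0 || n_r == 0.0 {
        return 0.0;
    }
    // Per-class proportions within each side of the split.
    let mut p_left = vec![0.0; n_classes];
    let mut p_right = vec![0.0; n_classes];
    for &c in left {
        p_left[c] += 1.0 / n_l;
    }
    for &c in right {
        p_right[c] += 1.0 / n_r;
    }
    let diff_sum: f64 = p_left
        .iter()
        .zip(&p_right)
        .map(|(l, r)| (l - r).abs())
        .sum();
    (n_l / n) * (n_r / n) / 4.0 * diff_sum * diff_sum
}
```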