Struct linfa_trees::DecisionTree
A fitted decision tree model for classification.
Structure
A decision tree structure is a binary tree where:
- Each internal node specifies a decision, represented by a choice of a feature and a “split value”, such that all observations for which feature <= split_value is true fall in the left subtree, while the others fall in the right subtree.
- Each leaf node makes a prediction: the most popular label among the observations in the node.
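Illustratively, such a structure could be sketched as follows. This is a hypothetical sketch with concrete types for brevity, not the crate's internal representation (its actual node type is TreeNode<F, L>):
// Hypothetical sketch of the binary tree described above
enum Sketch {
    Internal {
        feature: usize,    // index of the feature tested at this node
        split_value: f64,  // observations with feature <= split_value go left
        left: Box<Sketch>,
        right: Box<Sketch>,
    },
    Leaf {
        prediction: usize, // most popular label among the node's observations
    },
}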
Algorithm
Starting with a single root node, decision trees are trained recursively by applying the following rule to every node considered:
- Find the best split value for each feature of the observations belonging to the node;
- Select the feature (and its best split value) that maximizes the quality of the split;
- If the score of the split is sufficiently larger than the score of the unsplit node, two child nodes are generated: the left one containing all observations with feature <= split_value, and the right one containing the rest;
- If no suitable split is found, the node is marked as a leaf and its prediction is set to the most common label in the node.
The quality score used can be specified in the parameters.
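Reusing the hypothetical Sketch type from the Structure section, a much simplified version of this procedure, with Gini impurity decrease as the quality score, might look as follows. This is illustrative only; the crate's implementation also handles sample weights and the constraints described below:
use std::collections::HashMap;

type Row = (Vec<f64>, usize); // (feature values, label)

// Gini impurity of a set of labelled rows: 1 - sum of squared label frequencies
fn gini(rows: &[Row]) -> f64 {
    let mut counts: HashMap<usize, f64> = HashMap::new();
    for (_, label) in rows {
        *counts.entry(*label).or_insert(0.0) += 1.0;
    }
    let n = rows.len() as f64;
    1.0 - counts.values().map(|c| (c / n).powi(2)).sum::<f64>()
}

// Most common label in the node (rows must be non-empty)
fn majority_label(rows: &[Row]) -> usize {
    let mut counts: HashMap<usize, usize> = HashMap::new();
    for (_, label) in rows {
        *counts.entry(*label).or_insert(0) += 1;
    }
    counts.into_iter().max_by_key(|&(_, c)| c).unwrap().0
}

// Grow a subtree from a non-empty set of rows
fn grow(rows: Vec<Row>, min_gain: f64) -> Sketch {
    let parent = gini(&rows);
    let n = rows.len() as f64;
    // Scan every feature and every observed value for the split
    // that maximizes the impurity decrease (the quality score here)
    let mut best: Option<(usize, f64, f64)> = None; // (feature, split, gain)
    for f in 0..rows[0].0.len() {
        for (x, _) in &rows {
            let split = x[f];
            let (l, r): (Vec<Row>, Vec<Row>) =
                rows.iter().cloned().partition(|(v, _)| v[f] <= split);
            if l.is_empty() || r.is_empty() {
                continue;
            }
            let gain = parent
                - gini(&l) * l.len() as f64 / n
                - gini(&r) * r.len() as f64 / n;
            if best.map_or(true, |(_, _, g)| gain > g) {
                best = Some((f, split, gain));
            }
        }
    }
    match best {
        // Split only if the score improves enough over the unsplit node
        Some((feature, split_value, gain)) if gain > min_gain => {
            let (l, r): (Vec<Row>, Vec<Row>) =
                rows.into_iter().partition(|(v, _)| v[feature] <= split_value);
            Sketch::Internal {
                feature,
                split_value,
                left: Box::new(grow(l, min_gain)),
                right: Box::new(grow(r, min_gain)),
            }
        }
        // Otherwise the node becomes a leaf with the most common label
        _ => Sketch::Leaf { prediction: majority_label(&rows) },
    }
}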
Predictions
To predict the label of a sample, the tree is traversed from the root to a leaf, choosing between left and right children according to the values of the features of the sample. The final prediction for the sample is the prediction of the reached leaf.
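In terms of the hypothetical Sketch type from above, the traversal is:
// Walk from the root to a leaf, going left when feature <= split_value
fn predict_one(node: &Sketch, x: &[f64]) -> usize {
    match node {
        Sketch::Internal { feature, split_value, left, right } => {
            if x[*feature] <= *split_value {
                predict_one(left, x)
            } else {
                predict_one(right, x)
            }
        }
        Sketch::Leaf { prediction } => *prediction,
    }
}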
Additional constraints
In order to avoid overfitting the training data, some additional constraints on the quality/quantity of splits can be added to the tree. A description of these additional rules is provided in the parameters page.
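For example, using the builder methods on DecisionTreeParams (method names follow the parameter names listed under params below; the values here are arbitrary):
use linfa_trees::DecisionTree;

// Constrain the tree to curb overfitting: cap the depth, require a
// minimum weight in every leaf, and demand a minimum impurity decrease
let params = DecisionTree::<f64, usize>::params()
    .max_depth(Some(5))
    .min_weight_leaf(2.0)
    .min_impurity_decrease(0.01);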
Example
Here is an example of how to train a decision tree from its parameters:
use linfa_trees::DecisionTree;
use linfa::prelude::*;
use linfa_datasets;
// Load the dataset
let dataset = linfa_datasets::iris();
// Fit the tree
let tree = DecisionTree::params().fit(&dataset).unwrap();
// Get accuracy on training set
let accuracy = tree.predict(&dataset).confusion_matrix(&dataset).unwrap().accuracy();
assert!(accuracy > 0.9);
Implementations
impl<F: Float, L: Label> DecisionTree<F, L>
pub fn iter_nodes(&self) -> NodeIter<'_, F, L>
Create a node iterator in level-order (breadth-first traversal)
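For example, tallying node kinds on a tree fitted as in the example above (assumes TreeNode::is_leaf):
// Walk the fitted tree in level-order and count node kinds
let (mut internal, mut leaves) = (0, 0);
for node in tree.iter_nodes() {
    if node.is_leaf() { leaves += 1 } else { internal += 1 }
}
println!("{} internal nodes, {} leaves", internal, leaves);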
pub fn mean_impurity_decrease(&self) -> Vec<F>
Return the mean impurity decrease for each feature
pub fn relative_impurity_decrease(&self) -> Vec<F>
Return the relative impurity decrease for each feature
pub fn feature_importance(&self) -> Vec<F>
Return the feature importance, i.e. the relative impurity decrease, for each feature
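For example, pairing the importances with the dataset's feature names (tree and dataset as in the example above; assumes the dataset carries feature names via DatasetBase::feature_names):
// Print each iris feature with its importance in the fitted tree
for (name, importance) in dataset.feature_names().iter().zip(tree.feature_importance()) {
    println!("{}: {:.4}", name, importance);
}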
pub fn num_leaves(&self) -> usize
Return the number of leaves in this tree
pub fn export_to_tikz(&self) -> Tikz<'_, F, L>
Generates a Tikz structure to print the fitted tree in TeX using tikz and forest, with the following default parameters:
- legend = false
- complete = true
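A possible usage sketch, writing the LaTeX source to a file (assumes Tikz implements Display and that with_legend flips the legend default):
use std::{fs::File, io::Write};

// Render the fitted tree to LaTeX for compilation with tikz and forest
let mut file = File::create("tree.tex").unwrap();
file.write_all(tree.export_to_tikz().with_legend().to_string().as_bytes())
    .unwrap();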
impl<F: Float, L: Label> DecisionTree<F, L>
pub fn params() -> DecisionTreeParams<F, L>
Defaults are provided if the optional parameters are not specified:
- split_quality = SplitQuality::Gini
- max_depth = None
- min_weight_split = 2.0
- min_weight_leaf = 1.0
- min_impurity_decrease = 0.00001
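For example, overriding some of these defaults before fitting:
use linfa::prelude::*;
use linfa_trees::{DecisionTree, SplitQuality};

// Score splits with entropy instead of the default Gini, and cap the depth
let tree = DecisionTree::params()
    .split_quality(SplitQuality::Entropy)
    .max_depth(Some(4))
    .fit(&linfa_datasets::iris())
    .unwrap();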
Trait Implementations
impl<F: Float, L: Label + Default, D: Data<Elem = F>> PredictInplace<ArrayBase<D, Dim<[usize; 2]>>, ArrayBase<OwnedRepr<L>, Dim<[usize; 1]>>> for DecisionTree<F, L>
fn predict_inplace(&self, x: &ArrayBase<D, Ix2>, y: &mut Array1<L>)
Make predictions for each row of a matrix of features x.
fn default_target(&self, x: &ArrayBase<D, Ix2>) -> Array1<L>
Create targets that predict_inplace works with.
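For example, reusing one output buffer across calls (a sketch; predict from linfa's prelude covers the common one-shot case):
use linfa::traits::PredictInplace;

// Allocate a compatible target array once, then predict into it
let mut targets = tree.default_target(dataset.records());
tree.predict_inplace(dataset.records(), &mut targets);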
Auto Trait Implementations
impl<F, L> RefUnwindSafe for DecisionTree<F, L> where
F: RefUnwindSafe,
L: RefUnwindSafe,
impl<F, L> Send for DecisionTree<F, L> where
F: Send,
L: Send,
impl<F, L> Sync for DecisionTree<F, L> where
F: Sync,
L: Sync,
impl<F, L> Unpin for DecisionTree<F, L> where
F: Unpin,
L: Unpin,
impl<F, L> UnwindSafe for DecisionTree<F, L> where
F: UnwindSafe,
L: UnwindSafe,
Blanket Implementations
impl<T> BorrowMut<T> for T where
T: ?Sized,
pub fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.