Struct linfa_trees::DecisionTree[][src]

pub struct DecisionTree<F: Float, L: Label> { /* fields omitted */ }

A fitted decision tree model for classification.

Structure

A decision tree structure is a binary tree where:

  • Each internal node specifies a decision: a choice of feature and a “split value” such that all observations for which feature <= split_value holds fall into the left subtree, while the others fall into the right subtree.

  • Each leaf node makes a prediction: the most common label among the observations in the node.

Algorithm

Starting with a single root node, decision trees are trained recursively by applying the following rule to every node considered:

  • Find the best split value for each feature of the observations belonging to the node;
  • Select the feature (and its best split value) that maximizes the quality of the split;
  • If the score of the split is sufficiently larger than the score of the unsplit node, two child nodes are generated: the left one containing all observations with feature <= split value, and the right one containing the rest;
  • If no suitable split is found, the node is marked as a leaf and its prediction is set to the most common label in the node.

The quality score used can be specified in the parameters.
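The quality of a split can be measured, for instance, by the decrease in Gini impurity it achieves. The following std-only sketch is a hypothetical illustration of that score; linfa's internal implementation may differ:

```rust
use std::collections::HashMap;

/// Gini impurity of a set of labels: 1 - sum over labels of p(label)^2.
/// Hypothetical helper, not part of the linfa API.
fn gini(labels: &[usize]) -> f64 {
    if labels.is_empty() {
        return 0.0;
    }
    let mut counts: HashMap<usize, usize> = HashMap::new();
    for &l in labels {
        *counts.entry(l).or_insert(0) += 1;
    }
    let n = labels.len() as f64;
    1.0 - counts.values().map(|&c| (c as f64 / n).powi(2)).sum::<f64>()
}

/// Impurity decrease achieved by splitting `parent` into `left` and `right`,
/// weighting each child's impurity by its share of the observations.
fn impurity_decrease(parent: &[usize], left: &[usize], right: &[usize]) -> f64 {
    let n = parent.len() as f64;
    gini(parent)
        - (left.len() as f64 / n) * gini(left)
        - (right.len() as f64 / n) * gini(right)
}

fn main() {
    let parent = [0, 0, 1, 1];
    // A perfect split separates the two labels entirely.
    let dec = impurity_decrease(&parent, &[0, 0], &[1, 1]);
    println!("{dec}"); // 0.5 - 0.0 - 0.0 = 0.5
}
```

A split that leaves both children as mixed as the parent scores near zero and would be rejected.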

Predictions

To predict the label of a sample, the tree is traversed from the root to a leaf, choosing between left and right children according to the values of the features of the sample. The final prediction for the sample is the prediction of the reached leaf.
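The traversal above can be sketched with a toy node type (hypothetical and std-only; linfa's actual TreeNode type differs):

```rust
// Hypothetical node type for illustration, not the linfa TreeNode.
enum Node {
    Internal {
        feature: usize,
        split_value: f64,
        left: Box<Node>,
        right: Box<Node>,
    },
    Leaf {
        label: usize,
    },
}

/// Walk from the root to a leaf, going left when feature <= split_value,
/// and return the reached leaf's label.
fn predict(node: &Node, sample: &[f64]) -> usize {
    match node {
        Node::Leaf { label } => *label,
        Node::Internal { feature, split_value, left, right } => {
            if sample[*feature] <= *split_value {
                predict(left, sample)
            } else {
                predict(right, sample)
            }
        }
    }
}

fn main() {
    // A single decision: x[0] <= 2.0 ? label 0 : label 1
    let tree = Node::Internal {
        feature: 0,
        split_value: 2.0,
        left: Box::new(Node::Leaf { label: 0 }),
        right: Box::new(Node::Leaf { label: 1 }),
    };
    assert_eq!(predict(&tree, &[1.5]), 0);
    assert_eq!(predict(&tree, &[3.0]), 1);
}
```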

Additional constraints

In order to avoid overfitting the training data, some additional constraints on the quality/quantity of splits can be added to the tree. A description of these additional rules is provided in the parameters page.

Example

Here is an example of how to train a decision tree from its parameters:


use linfa_trees::DecisionTree;
use linfa::prelude::*;
use linfa_datasets;

// Load the dataset
let dataset = linfa_datasets::iris();
// Fit the tree
let tree = DecisionTree::params().fit(&dataset).unwrap();
// Get accuracy on training set
let accuracy = tree.predict(&dataset).confusion_matrix(&dataset).unwrap().accuracy();

assert!(accuracy > 0.9);

Implementations

impl<F: Float, L: Label + Debug> DecisionTree<F, L>[src]

pub fn params() -> DecisionTreeParams<F, L>[src]

Defaults are provided if the optional parameters are not specified:

  • split_quality = SplitQuality::Gini
  • max_depth = None
  • min_weight_split = 2.0
  • min_weight_leaf = 1.0
  • min_impurity_decrease = 0.00001
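These defaults can be read as gates on when a node is allowed to split. The following std-only sketch is hypothetical (the constant names mirror the parameters above, but the logic is an illustration, not linfa's implementation):

```rust
// Default thresholds, mirroring the parameter list above.
const MIN_WEIGHT_SPLIT: f64 = 2.0;
const MIN_WEIGHT_LEAF: f64 = 1.0;
const MIN_IMPURITY_DECREASE: f64 = 0.00001;

/// Hypothetical gate: a candidate split is accepted only when the node is
/// heavy enough to split, both children are heavy enough to be leaves, and
/// the split improves impurity by at least the configured threshold.
fn split_allowed(
    node_weight: f64,
    left_weight: f64,
    right_weight: f64,
    impurity_decrease: f64,
) -> bool {
    node_weight >= MIN_WEIGHT_SPLIT
        && left_weight >= MIN_WEIGHT_LEAF
        && right_weight >= MIN_WEIGHT_LEAF
        && impurity_decrease >= MIN_IMPURITY_DECREASE
}

fn main() {
    assert!(split_allowed(4.0, 2.0, 2.0, 0.25));
    assert!(!split_allowed(4.0, 0.5, 3.5, 0.25)); // left child too light
    assert!(!split_allowed(4.0, 2.0, 2.0, 0.0)); // no impurity gain
}
```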

pub fn iter_nodes(&self) -> NodeIter<'_, F, L>

Notable traits for NodeIter<'a, F, L>

impl<'a, F: Float, L: Debug + Label> Iterator for NodeIter<'a, F, L> type Item = &'a TreeNode<F, L>;
[src]

Return an iterator over the tree's nodes in level order (breadth-first traversal)

pub fn features(&self) -> Vec<usize>[src]

Return the indices of the features used by this tree, collected in level order (breadth-first traversal)

pub fn mean_impurity_decrease(&self) -> Vec<F>[src]

Return the mean impurity decrease for each feature

pub fn relative_impurity_decrease(&self) -> Vec<F>[src]

Return the relative impurity decrease for each feature

pub fn feature_importance(&self) -> Vec<F>[src]

Return the feature importance, i.e. the relative impurity decrease, for each feature
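
One natural way to obtain relative importances from mean impurity decreases is to normalize them to sum to one. The sketch below is a hypothetical, std-only illustration of that normalization; linfa's exact definition may differ:

```rust
/// Hypothetical helper: normalize per-feature mean impurity decreases so
/// that the resulting importances sum to 1 (all zeros if there is no
/// impurity decrease at all).
fn relative_importance(mean_decrease: &[f64]) -> Vec<f64> {
    let total: f64 = mean_decrease.iter().sum();
    if total == 0.0 {
        return vec![0.0; mean_decrease.len()];
    }
    mean_decrease.iter().map(|d| d / total).collect()
}

fn main() {
    let rel = relative_importance(&[0.3, 0.1, 0.0]);
    println!("{rel:?}"); // [0.75, 0.25, 0.0]
}
```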

pub fn root_node(&self) -> &TreeNode<F, L>[src]

Return root node of the tree

pub fn max_depth(&self) -> usize[src]

Return max depth of the tree

pub fn num_leaves(&self) -> usize[src]

Return the number of leaves in this tree

pub fn export_to_tikz(&self) -> Tikz<'_, F, L>[src]

Generates a Tikz structure for printing the fitted tree in TeX using the tikz and forest packages, with the following default parameters:

  • legend=false
  • complete=true

Trait Implementations

impl<F: Debug + Float, L: Debug + Label> Debug for DecisionTree<F, L>[src]

impl<F: Float, L: Label, D: Data<Elem = F>> PredictRef<ArrayBase<D, Dim<[usize; 2]>>, ArrayBase<OwnedRepr<L>, Dim<[usize; 1]>>> for DecisionTree<F, L>[src]

fn predict_ref<'a>(&'a self, x: &ArrayBase<D, Ix2>) -> Array1<L>[src]

Make predictions for each row of a matrix of features x.

Auto Trait Implementations

impl<F, L> RefUnwindSafe for DecisionTree<F, L> where
    F: RefUnwindSafe,
    L: RefUnwindSafe

impl<F, L> Send for DecisionTree<F, L> where
    L: Send

impl<F, L> Sync for DecisionTree<F, L> where
    L: Sync

impl<F, L> Unpin for DecisionTree<F, L> where
    F: Unpin,
    L: Unpin

impl<F, L> UnwindSafe for DecisionTree<F, L> where
    F: UnwindSafe,
    L: UnwindSafe

Blanket Implementations

impl<T> Any for T where
    T: 'static + ?Sized
[src]

impl<T> Borrow<T> for T where
    T: ?Sized
[src]

impl<T> BorrowMut<T> for T where
    T: ?Sized
[src]

impl<T> From<T> for T[src]

impl<T, U> Into<U> for T where
    U: From<T>, 
[src]

impl<T> Pointable for T

type Init = T

The type for initializers.

impl<'a, F, D, T, O> Predict<&'a ArrayBase<D, Dim<[usize; 2]>>, T> for O where
    F: Float,
    D: Data<Elem = F>,
    O: PredictRef<ArrayBase<D, Dim<[usize; 2]>>, T>, 
[src]

impl<'a, F, R, T, S, O> Predict<&'a DatasetBase<R, T>, S> for O where
    F: Float,
    R: Records<Elem = F>,
    O: PredictRef<R, S>, 
[src]

impl<F, D, T, O> Predict<ArrayBase<D, Dim<[usize; 2]>>, DatasetBase<ArrayBase<D, Dim<[usize; 2]>>, T>> for O where
    F: Float,
    D: Data<Elem = F>,
    O: PredictRef<ArrayBase<D, Dim<[usize; 2]>>, T>, 
[src]

impl<F, R, T, S, O> Predict<DatasetBase<R, T>, DatasetBase<R, S>> for O where
    F: Float,
    R: Records<Elem = F>,
    O: PredictRef<R, S>, 
[src]

impl<T, U> TryFrom<U> for T where
    U: Into<T>, 
[src]

type Error = Infallible

The type returned in the event of a conversion error.

impl<T, U> TryInto<U> for T where
    U: TryFrom<T>, 
[src]

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

impl<V, T> VZip<V> for T where
    V: MultiLane<T>,