Trait opencv::ml::RTrees

pub trait RTrees: DTrees {
    pub fn as_raw_RTrees(&self) -> *const c_void;
    pub fn as_raw_mut_RTrees(&mut self) -> *mut c_void;
    pub fn get_calculate_var_importance(&self) -> Result<bool> { ... }
    pub fn set_calculate_var_importance(&mut self, val: bool) -> Result<()> { ... }
    pub fn get_active_var_count(&self) -> Result<i32> { ... }
    pub fn set_active_var_count(&mut self, val: i32) -> Result<()> { ... }
    pub fn get_term_criteria(&self) -> Result<TermCriteria> { ... }
    pub fn set_term_criteria(&mut self, val: TermCriteria) -> Result<()> { ... }
    pub fn get_var_importance(&self) -> Result<Mat> { ... }
    pub fn get_votes(
        &self,
        samples: &dyn ToInputArray,
        results: &mut dyn ToOutputArray,
        flags: i32
    ) -> Result<()> { ... }
    pub fn get_oob_error(&self) -> Result<f64> { ... }
}

The class implements the random forest predictor.

See also

@ref ml_intro_rtrees
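
As a rough sketch of how this trait is typically used from the opencv crate, the snippet below creates an empty random forest and configures it through the provided methods on this page. Treat it as an assumption-laden sketch rather than canonical usage: it assumes a crate version that exposes opencv::prelude, the TermCriteria::new constructor, and the TermCriteria_COUNT/TermCriteria_EPS constants in opencv::core (COUNT is OpenCV's alias for the MAX_ITERS value used in the defaults below), and exact signatures can differ between crate versions.

use opencv::{core, ml, prelude::*, Result};

fn main() -> Result<()> {
    // Create an empty random forest (see `create` under Implementations below).
    let mut forest = <dyn ml::RTrees>::create()?;

    // Ask for variable importance so that get_var_importance() returns a non-empty Mat.
    forest.set_calculate_var_importance(true)?;

    // 0 means "use sqrt(total number of features)" when looking for the best split.
    forest.set_active_var_count(0)?;

    // Stop after 100 trees or once the OOB error falls below 0.01; the constant sum
    // mirrors the TermCriteria::MAX_ITERS + TermCriteria::EPS default documented below.
    let crit = core::TermCriteria::new(
        core::TermCriteria_COUNT + core::TermCriteria_EPS,
        100,
        0.01,
    )?;
    forest.set_term_criteria(crit)?;

    // Training and prediction are inherited from StatModel/DTrees and are not shown here.
    Ok(())
}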

Required methods

pub fn as_raw_RTrees(&self) -> *const c_void
pub fn as_raw_mut_RTrees(&mut self) -> *mut c_void

Provided methods

pub fn get_calculate_var_importance(&self) -> Result<bool>

If true, variable importance will be calculated and can then be retrieved by RTrees::getVarImportance. Default value is false.

See also

setCalculateVarImportance

pub fn set_calculate_var_importance(&mut self, val: bool) -> Result<()>

If true, variable importance will be calculated and can then be retrieved by RTrees::getVarImportance. Default value is false.

See also

setCalculateVarImportance getCalculateVarImportance

pub fn get_active_var_count(&self) -> Result<i32>

The size of the randomly selected subset of features at each tree node that is used to find the best split(s). If you set it to 0, the size will be set to the square root of the total number of features. Default value is 0.

See also

setActiveVarCount

pub fn set_active_var_count(&mut self, val: i32) -> Result<()>

The size of the randomly selected subset of features at each tree node that is used to find the best split(s). If you set it to 0, the size will be set to the square root of the total number of features. Default value is 0.

See also

setActiveVarCount getActiveVarCount

pub fn get_term_criteria(&self) -> Result<TermCriteria>

The termination criteria that specifies when the training algorithm stops: either when the specified number of trees is trained and added to the ensemble, or when sufficient accuracy (measured as OOB error) is achieved. Typically, the more trees you have, the better the accuracy. However, the improvement in accuracy generally diminishes and levels off past a certain number of trees. Also keep in mind that prediction time grows linearly with the number of trees. Default value is TermCriteria(TermCriteria::MAX_ITERS + TermCriteria::EPS, 50, 0.1).

See also

setTermCriteria

pub fn set_term_criteria(&mut self, val: TermCriteria) -> Result<()>

The termination criteria that specifies when the training algorithm stops: either when the specified number of trees is trained and added to the ensemble, or when sufficient accuracy (measured as OOB error) is achieved. Typically, the more trees you have, the better the accuracy. However, the improvement in accuracy generally diminishes and levels off past a certain number of trees. Also keep in mind that prediction time grows linearly with the number of trees. Default value is TermCriteria(TermCriteria::MAX_ITERS + TermCriteria::EPS, 50, 0.1).

See also

setTermCriteria getTermCriteria

pub fn get_var_importance(&self) -> Result<Mat>

Returns the variable importance array. The vector is computed at the training stage when CalculateVarImportance is set to true; if that flag was set to false, an empty matrix is returned.
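
As a sketch only, after training with CalculateVarImportance enabled the scores could be read back roughly as below. The helper name is hypothetical, and the Mat accessors used (empty, total, at) are assumed to have the Result-returning signatures of this crate version; getVarImportance stores one single-precision score per feature.

use opencv::{ml, prelude::*, Result};

// Hypothetical helper: print the importance score of every feature.
// Assumes the model was trained with set_calculate_var_importance(true).
fn print_var_importance(forest: &dyn ml::RTrees) -> Result<()> {
    let importance = forest.get_var_importance()?;
    if !importance.empty()? {
        for i in 0..importance.total()? as i32 {
            // One f32 importance value per feature.
            println!("feature {}: {}", i, importance.at::<f32>(i)?);
        }
    }
    Ok(())
}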

pub fn get_votes(
    &self,
    samples: &dyn ToInputArray,
    results: &mut dyn ToOutputArray,
    flags: i32
) -> Result<()>

Returns the result of each individual tree in the forest. In case the model is a regression problem, the method will return each of the trees' results for each of the sample cases. If the model is a classifier, it will return a Mat with samples + 1 rows, where the first row gives the class number and the following rows return the votes each class had for each sample.

Parameters

  • samples: Array containing the samples for which votes will be calculated.
  • results: Array where the result of the calculation will be written.
  • flags: Flags for defining the type of RTrees.

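A minimal sketch of calling get_votes on a trained classifier. The helper name is hypothetical, passing 0 for flags is an assumption (the meaningful flag values come from StatModel and are not listed on this page), the output layout follows the description above, and Mat::default() is assumed to be the Result-returning constructor of this crate version.

use opencv::{core, ml, prelude::*, Result};

// Hypothetical helper: collect the per-tree votes for a batch of samples
// (one CV_32F row per sample).
fn tree_votes(forest: &dyn ml::RTrees, samples: &core::Mat) -> Result<core::Mat> {
    let mut votes = core::Mat::default()?;
    // Row 0 of `votes` lists the class labels; each following row holds,
    // per class, how many trees voted for that class for one sample.
    forest.get_votes(samples, &mut votes, 0)?;
    Ok(votes)
}
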
pub fn get_oob_error(&self) -> Result<f64>

Returns the out-of-bag (OOB) error of the trained forest, i.e. the accuracy measure referenced by the termination criteria above.

Implementations

impl<'_> dyn RTrees + '_

pub fn create() -> Result<Ptr<dyn RTrees>>

Creates the empty model. Use StatModel::train to train the model (or to create and train it in one step), and Algorithm::load to load a pre-trained model.

pub fn load(filepath: &str, node_name: &str) -> Result<Ptr<dyn RTrees>>

Loads and creates a serialized RTree from a file.

Use RTree::save to serialize and store an RTree to disk. Load the RTree from this file again by calling this function with the path to the file. Optionally specify the node for the file containing the classifier.

Parameters

  • filepath: path to serialized RTree
  • nodeName: name of node containing the classifier

C++ default parameters

  • node_name: String()
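
A hedged sketch of the save/load round trip described above. It assumes Algorithm::save is reachable through the trait object (save belongs to the Algorithm base class) and that this crate version exposes the generic core::Ptr; the empty node name mirrors the String() default.

use opencv::{core, ml, prelude::*, Result};

// Hypothetical helper: serialize a trained forest and read it back.
fn save_and_reload(forest: &dyn ml::RTrees, path: &str) -> Result<core::Ptr<dyn ml::RTrees>> {
    // Algorithm::save writes the model to disk.
    forest.save(path)?;
    // "" mirrors the String() default for node_name.
    <dyn ml::RTrees>::load(path, "")
}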

Implementors

impl RTrees for PtrOfRTrees