Type Definition opencv::types::PtrOfDTrees
Implementations
impl PtrOfDTrees
pub fn as_raw_PtrOfDTrees(&self) -> *const c_void
pub fn as_raw_mut_PtrOfDTrees(&mut self) -> *mut c_void
Trait Implementations
impl AlgorithmTrait for PtrOfDTrees
impl AlgorithmTraitConst for PtrOfDTrees
fn as_raw_Algorithm(&self) -> *const c_void
fn write(&self, fs: &mut FileStorage) -> Result<()>
Stores algorithm parameters in a file storage.
fn write_with_name(&self, fs: &Ptr<FileStorage>, name: &str) -> Result<()>
Stores algorithm parameters in a file storage. Simplified API for language bindings.
fn empty(&self) -> Result<bool>
Returns true if the Algorithm is empty (e.g. at the very beginning or after an unsuccessful read).
fn save(&self, filename: &str) -> Result<()>
Saves the algorithm to a file.
In order to make this method work, the derived class must implement Algorithm::write(FileStorage& fs).
fn get_default_name(&self) -> Result<String>
Returns the algorithm string identifier.
This string is used as the top-level xml/yml node tag when the object is saved to a file or string.
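The Algorithm-level methods above combine naturally when persisting a trained model. A minimal sketch (the filename `model.yml` is illustrative; the `opencv` crate's prelude is assumed to bring the trait methods into scope, and the model must have been trained before saving is useful):

```rust
use opencv::prelude::*;
use opencv::types::PtrOfDTrees;
use opencv::Result;

/// Print the model's identifier and save it unless it is empty.
fn save_model(model: &PtrOfDTrees) -> Result<()> {
    // Top-level xml/yml node tag used when the object is serialized.
    println!("default name: {}", model.get_default_name()?);
    if !model.empty()? {
        // Requires the underlying class to implement Algorithm::write.
        model.save("model.yml")?;
    }
    Ok(())
}
```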
impl DTrees for PtrOfDTrees
fn as_raw_mut_DTrees(&mut self) -> *mut c_void
fn set_max_categories(&mut self, val: i32) -> Result<()>
Cluster possible values of a categorical variable into K <= maxCategories clusters to find a suboptimal split.
If a discrete variable on which the training procedure tries to make a split takes more than maxCategories values, the precise best subset estimation may take a very long time because the algorithm is exponential. Instead, many decision-tree engines (including this implementation) try to find a sub-optimal split in this case by clustering all the samples into maxCategories clusters; that is, some categories are merged together. The clustering is applied only in classification problems with more than two classes, for categorical variables with more than maxCategories possible values. In case of regression and 2-class classification the optimal split can be found efficiently without employing clustering, so the parameter is not used in those cases. Default value is 10.
fn set_max_depth(&mut self, val: i32) -> Result<()>
The maximum possible depth of the tree.
That is, the training algorithm attempts to split a node while its depth is less than maxDepth. The root node has zero depth. The actual depth may be smaller if the other termination criteria are met (see the outline of the training procedure @ref ml_intro_trees “here”), and/or if the tree is pruned. Default value is INT_MAX.
fn set_min_sample_count(&mut self, val: i32) -> Result<()>
If the number of samples in a node is less than this parameter then the node will not be split.
fn set_cv_folds(&mut self, val: i32) -> Result<()>
If CVFolds > 1 then the algorithm prunes the built decision tree using K-fold cross-validation, where K is equal to CVFolds. Default value is 10.
fn set_use_surrogates(&mut self, val: bool) -> Result<()>
If true then surrogate splits will be built.
These splits allow working with missing data and computing variable importance correctly. Default value is false.
fn set_use1_se_rule(&mut self, val: bool) -> Result<()>
If true then pruning will be harsher.
This makes the tree more compact and more resistant to training-data noise, but a bit less accurate. Default value is true.
fn set_truncate_pruned_tree(&mut self, val: bool) -> Result<()>
If true then pruned branches are physically removed from the tree.
Otherwise they are retained and it is possible to get results from the original unpruned (or less aggressively pruned) tree. Default value is true.
fn set_regression_accuracy(&mut self, val: f32) -> Result<()>
Termination criteria for regression trees.
If all absolute differences between an estimated value in a node and the values of the train samples in this node are less than this parameter, then the node will not be split further. Default value is 0.01f.
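The setters above are typically applied right after constructing a tree and before training. A minimal sketch, assuming `opencv::ml::DTrees::create()` is available (the factory's exact return type varies between crate versions) and that the parameter values chosen here are illustrative, not recommendations:

```rust
use opencv::ml::DTrees;
use opencv::prelude::*;
use opencv::Result;

/// Build a DTrees instance and configure it via the setters above.
fn configure_tree() -> Result<()> {
    let mut tree = DTrees::create()?;
    tree.set_max_depth(8)?;          // cap depth; the default is INT_MAX
    tree.set_min_sample_count(5)?;   // don't split nodes with fewer than 5 samples
    tree.set_cv_folds(0)?;           // disable cross-validation pruning
    tree.set_use_surrogates(false)?; // no surrogate splits (no missing data)
    tree.set_regression_accuracy(0.01)?;
    Ok(())
}
```

Setting `cv_folds` to 0 is a common choice on small datasets, where K-fold pruning can otherwise leave too few samples per fold.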
impl DTreesConst for PtrOfDTrees
fn as_raw_DTrees(&self) -> *const c_void
fn get_max_categories(&self) -> Result<i32>
Cluster possible values of a categorical variable into K <= maxCategories clusters to find a suboptimal split.
If a discrete variable on which the training procedure tries to make a split takes more than maxCategories values, the precise best subset estimation may take a very long time because the algorithm is exponential. Instead, many decision-tree engines (including this implementation) try to find a sub-optimal split in this case by clustering all the samples into maxCategories clusters; that is, some categories are merged together. The clustering is applied only in classification problems with more than two classes, for categorical variables with more than maxCategories possible values. In case of regression and 2-class classification the optimal split can be found efficiently without employing clustering, so the parameter is not used in those cases. Default value is 10.
fn get_max_depth(&self) -> Result<i32>
The maximum possible depth of the tree.
That is, the training algorithm attempts to split a node while its depth is less than maxDepth. The root node has zero depth. The actual depth may be smaller if the other termination criteria are met (see the outline of the training procedure @ref ml_intro_trees “here”), and/or if the tree is pruned. Default value is INT_MAX.
fn get_min_sample_count(&self) -> Result<i32>
If the number of samples in a node is less than this parameter then the node will not be split.
fn get_cv_folds(&self) -> Result<i32>
If CVFolds > 1 then the algorithm prunes the built decision tree using K-fold cross-validation, where K is equal to CVFolds. Default value is 10.
fn get_use_surrogates(&self) -> Result<bool>
If true then surrogate splits will be built.
These splits allow working with missing data and computing variable importance correctly. Default value is false.
fn get_use1_se_rule(&self) -> Result<bool>
If true then pruning will be harsher.
This makes the tree more compact and more resistant to training-data noise, but a bit less accurate. Default value is true.
fn get_truncate_pruned_tree(&self) -> Result<bool>
If true then pruned branches are physically removed from the tree.
Otherwise they are retained and it is possible to get results from the original unpruned (or less aggressively pruned) tree. Default value is true.
fn get_regression_accuracy(&self) -> Result<f32>
Termination criteria for regression trees.
If all absolute differences between an estimated value in a node and the values of the train samples in this node are less than this parameter, then the node will not be split further. Default value is 0.01f.
fn get_priors(&self) -> Result<Mat>
The array of a priori class probabilities, sorted by the class label value.
fn get_splits(&self) -> Result<Vector<DTrees_Split>>
Returns all the splits.
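The const accessors above can be used to inspect a model's configuration and learned structure after training. A minimal sketch (assumes the `opencv` crate's prelude brings the `DTreesConst` trait methods into scope):

```rust
use opencv::prelude::*;
use opencv::types::PtrOfDTrees;
use opencv::Result;

/// Dump a few parameters and the learned split count of a trained tree.
fn inspect(model: &PtrOfDTrees) -> Result<()> {
    println!("max depth:       {}", model.get_max_depth()?);
    println!("min sample count:{}", model.get_min_sample_count()?);
    println!("cv folds:        {}", model.get_cv_folds()?);
    // get_splits() is only meaningful after training; on an untrained
    // model the returned vector is empty.
    let splits = model.get_splits()?;
    println!("learned splits:  {}", splits.len());
    Ok(())
}
```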
impl StatModel for PtrOfDTrees
fn as_raw_mut_StatModel(&mut self) -> *mut c_void
fn train_with_data(
    &mut self,
    train_data: &Ptr<dyn TrainData>,
    flags: i32
) -> Result<bool>
Trains the statistical model.
fn train(
    &mut self,
    samples: &dyn ToInputArray,
    layout: i32,
    responses: &dyn ToInputArray
) -> Result<bool>
Trains the statistical model.
impl StatModelConst for PtrOfDTrees
fn as_raw_StatModel(&self) -> *const c_void
fn get_var_count(&self) -> Result<i32>
Returns the number of variables in training samples.
fn empty(&self) -> Result<bool>
fn is_trained(&self) -> Result<bool>
Returns true if the model is trained.
fn is_classifier(&self) -> Result<bool>
Returns true if the model is a classifier.
fn calc_error(
    &self,
    data: &Ptr<dyn TrainData>,
    test: bool,
    resp: &mut dyn ToOutputArray
) -> Result<f32>
Computes the error on the training or test dataset.
fn predict(
    &self,
    samples: &dyn ToInputArray,
    results: &mut dyn ToOutputArray,
    flags: i32
) -> Result<f32>
Predicts response(s) for the provided sample(s).
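The `train`/`predict` pair above is the core StatModel workflow. A minimal sketch on toy data, assuming the `ml` module's usual conventions (row-per-sample layout via `ROW_SAMPLE`, `f32` features, `i32` class labels) and a crate version where `Mat::from_slice_2d` and `DTrees::create` have these shapes:

```rust
use opencv::core::Mat;
use opencv::ml::{DTrees, ROW_SAMPLE};
use opencv::prelude::*;
use opencv::Result;

/// Train a decision tree on four 2-feature samples, then predict.
fn train_and_predict() -> Result<()> {
    // Toy data: label 1 when the first feature is 1.0, else 0.
    let samples = Mat::from_slice_2d(&[
        [1.0f32, 1.0],
        [1.0, 0.0],
        [0.0, 1.0],
        [0.0, 0.0],
    ])?;
    let responses = Mat::from_slice(&[1i32, 1, 0, 0])?;

    let mut tree = DTrees::create()?;
    tree.set_cv_folds(0)?; // cross-validation pruning needs more data than this
    tree.train(&samples, ROW_SAMPLE, &responses)?;

    // predict() fills `results` with one response per input row and
    // returns the response for the first sample as f32.
    let mut results = Mat::default();
    let first = tree.predict(&samples, &mut results, 0)?;
    println!("prediction for first sample: {}", first);
    Ok(())
}
```

Note that `flags` is 0 here; other values (e.g. raw-output flags defined by the `ml` module) change what `predict` writes into `results`.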