pub trait SVMTrait: SVMTraitConst + StatModelTrait {
    // Required method
    fn as_raw_mut_SVM(&mut self) -> *mut c_void;

    // Provided methods
    fn set_type(&mut self, val: i32) -> Result<()> { ... }
    fn set_gamma(&mut self, val: f64) -> Result<()> { ... }
    fn set_coef0(&mut self, val: f64) -> Result<()> { ... }
    fn set_degree(&mut self, val: f64) -> Result<()> { ... }
    fn set_c(&mut self, val: f64) -> Result<()> { ... }
    fn set_nu(&mut self, val: f64) -> Result<()> { ... }
    fn set_p(&mut self, val: f64) -> Result<()> { ... }
    fn set_class_weights(&mut self, val: &impl MatTraitConst) -> Result<()> { ... }
    fn set_term_criteria(&mut self, val: TermCriteria) -> Result<()> { ... }
    fn set_kernel(&mut self, kernel_type: i32) -> Result<()> { ... }
    fn set_custom_kernel(&mut self, _kernel: &Ptr<SVM_Kernel>) -> Result<()> { ... }
    fn train_auto(
        &mut self,
        data: &Ptr<TrainData>,
        k_fold: i32,
        cgrid: impl ParamGridTrait,
        gamma_grid: impl ParamGridTrait,
        p_grid: impl ParamGridTrait,
        nu_grid: impl ParamGridTrait,
        coeff_grid: impl ParamGridTrait,
        degree_grid: impl ParamGridTrait,
        balanced: bool,
    ) -> Result<bool> { ... }
    fn train_auto_def(&mut self, data: &Ptr<TrainData>) -> Result<bool> { ... }
    fn train_auto_with_data(
        &mut self,
        samples: &impl ToInputArray,
        layout: i32,
        responses: &impl ToInputArray,
        k_fold: i32,
        cgrid: Ptr<ParamGrid>,
        gamma_grid: Ptr<ParamGrid>,
        p_grid: Ptr<ParamGrid>,
        nu_grid: Ptr<ParamGrid>,
        coeff_grid: Ptr<ParamGrid>,
        degree_grid: Ptr<ParamGrid>,
        balanced: bool,
    ) -> Result<bool> { ... }
    fn train_auto_with_data_def(
        &mut self,
        samples: &impl ToInputArray,
        layout: i32,
        responses: &impl ToInputArray,
    ) -> Result<bool> { ... }
}
Mutable methods for crate::ml::SVM
Required Methods§
fn as_raw_mut_SVM(&mut self) -> *mut c_void
Provided Methods§
fn set_gamma(&mut self, val: f64) -> Result<()>
Parameter gamma of a kernel function. For SVM::POLY, SVM::RBF, SVM::SIGMOID or SVM::CHI2. Default value is 1.
§See also
setGamma getGamma
fn set_coef0(&mut self, val: f64) -> Result<()>
Parameter coef0 of a kernel function. For SVM::POLY or SVM::SIGMOID. Default value is 0.
§See also
setCoef0 getCoef0
fn set_degree(&mut self, val: f64) -> Result<()>
Parameter degree of a kernel function. For SVM::POLY. Default value is 0.
§See also
setDegree getDegree
fn set_c(&mut self, val: f64) -> Result<()>
Parameter C of an SVM optimization problem. For SVM::C_SVC, SVM::EPS_SVR or SVM::NU_SVR. Default value is 0.
§See also
setC getC
fn set_nu(&mut self, val: f64) -> Result<()>
Parameter nu of an SVM optimization problem. For SVM::NU_SVC, SVM::ONE_CLASS or SVM::NU_SVR. Default value is 0.
§See also
setNu getNu
fn set_class_weights(&mut self, val: &impl MatTraitConst) -> Result<()>
Optional weights in the SVM::C_SVC problem, assigned to particular classes. They are multiplied by C, so the parameter C of class i becomes classWeights(i) * C. Thus these weights affect the misclassification penalty for different classes: the larger the weight, the larger the penalty on misclassification of data from the corresponding class. Default value is an empty Mat.
§See also
setClassWeights getClassWeights
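The weighting rule above can be sketched in plain Rust. This is only an illustration of the arithmetic (the `effective_penalties` helper is hypothetical, not part of the opencv crate):

```rust
/// Effective per-class penalty under class weighting: C_i = classWeights(i) * C.
/// Illustrates the rule described above; not the OpenCV API itself.
fn effective_penalties(c: f64, class_weights: &[f64]) -> Vec<f64> {
    class_weights.iter().map(|w| w * c).collect()
}
```

For a 2-class problem where misclassifying class 1 should be five times more costly, `effective_penalties(1.0, &[1.0, 5.0])` yields penalties `[1.0, 5.0]`.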
fn set_term_criteria(&mut self, val: TermCriteria) -> Result<()>
Termination criteria of the iterative SVM training procedure, which solves a partial case of a constrained quadratic optimization problem. You can specify tolerance and/or the maximum number of iterations. Default value is TermCriteria( TermCriteria::MAX_ITER + TermCriteria::EPS, 1000, FLT_EPSILON ).
§See also
setTermCriteria getTermCriteria
fn set_kernel(&mut self, kernel_type: i32) -> Result<()>
Initialize with one of predefined kernels. See SVM::KernelTypes.
fn set_custom_kernel(&mut self, _kernel: &Ptr<SVM_Kernel>) -> Result<()>
Initialize with a custom kernel. See the SVM::Kernel class for implementation details.
fn train_auto(
    &mut self,
    data: &Ptr<TrainData>,
    k_fold: i32,
    cgrid: impl ParamGridTrait,
    gamma_grid: impl ParamGridTrait,
    p_grid: impl ParamGridTrait,
    nu_grid: impl ParamGridTrait,
    coeff_grid: impl ParamGridTrait,
    degree_grid: impl ParamGridTrait,
    balanced: bool,
) -> Result<bool>
Trains an SVM with optimal parameters.
§Parameters
- data: the training data that can be constructed using TrainData::create or TrainData::loadFromCSV.
- kFold: Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
- Cgrid: grid for C
- gammaGrid: grid for gamma
- pGrid: grid for p
- nuGrid: grid for nu
- coeffGrid: grid for coeff
- degreeGrid: grid for degree
- balanced: If true and the problem is 2-class classification, the method creates more balanced cross-validation subsets, that is, subsets with class proportions close to those of the whole training dataset.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
If there is no need to optimize a parameter, the corresponding grid step should be set to any value less than or equal to 1. For example, to avoid optimization in gamma, set gammaGrid.step = 0, with gammaGrid.minVal and gammaGrid.maxVal as arbitrary numbers. In this case, the current value of gamma is taken.
And, finally, if the optimization in a parameter is required but the corresponding grid is unknown, you may call the function SVM::getDefaultGrid. To generate a grid, for example, for gamma, call SVM::getDefaultGrid(SVM::GAMMA).
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
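The kFold cross-validation scheme described above can be sketched in plain Rust. This shows only the splitting idea, not OpenCV's internal implementation (the interleaved index assignment and the `k_fold_splits` helper are assumptions for illustration):

```rust
/// Produce k (train_indices, test_indices) pairs: each of the k subsets serves
/// once as the validation fold while the remaining samples form the train set.
fn k_fold_splits(n_samples: usize, k: usize) -> Vec<(Vec<usize>, Vec<usize>)> {
    (0..k)
        .map(|fold| {
            let (mut train, mut test) = (Vec::new(), Vec::new());
            for i in 0..n_samples {
                // Assign sample i to a fold round-robin style (illustrative choice).
                if i % k == fold { test.push(i) } else { train.push(i) }
            }
            (train, test)
        })
        .collect()
}
```

With 10 samples and k = 5, each of the 5 splits holds out 2 samples for testing and trains on the other 8, so the model is trained and evaluated 5 times.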
§C++ default parameters
- k_fold: 10
- cgrid: getDefaultGrid(C)
- gamma_grid: getDefaultGrid(GAMMA)
- p_grid: getDefaultGrid(P)
- nu_grid: getDefaultGrid(NU)
- coeff_grid: getDefaultGrid(COEF)
- degree_grid: getDefaultGrid(DEGREE)
- balanced: false
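The default grids above are logarithmic: cv::ml::ParamGrid iterates minVal, minVal*step, minVal*step², and so on, while the value stays below maxVal (step must be greater than 1; a step ≤ 1 disables optimization of that parameter, as noted above). A plain-Rust sketch of that sequence (the `param_grid_values` helper is illustrative, not the opencv crate API):

```rust
/// Expand a logarithmic parameter grid: minVal * step^i for all i with
/// minVal * step^i < maxVal. A step <= 1 yields a degenerate single-value grid.
fn param_grid_values(min_val: f64, max_val: f64, step: f64) -> Vec<f64> {
    if step <= 1.0 {
        // Degenerate grid: the parameter is not optimized.
        return vec![min_val];
    }
    let mut vals = Vec::new();
    let mut v = min_val;
    while v < max_val {
        vals.push(v);
        v *= step;
    }
    vals
}
```

For example, a grid with minVal = 1, maxVal = 100, step = 10 tries the values 1 and 10 (100 is excluded because the sequence stops strictly below maxVal).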
fn train_auto_def(&mut self, data: &Ptr<TrainData>) -> Result<bool>
Trains an SVM with optimal parameters.
§Parameters
- data: the training data that can be constructed using TrainData::create or TrainData::loadFromCSV.
- kFold: Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
- Cgrid: grid for C
- gammaGrid: grid for gamma
- pGrid: grid for p
- nuGrid: grid for nu
- coeffGrid: grid for coeff
- degreeGrid: grid for degree
- balanced: If true and the problem is 2-class classification, the method creates more balanced cross-validation subsets, that is, subsets with class proportions close to those of the whole training dataset.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
If there is no need to optimize a parameter, the corresponding grid step should be set to any value less than or equal to 1. For example, to avoid optimization in gamma, set gammaGrid.step = 0, with gammaGrid.minVal and gammaGrid.maxVal as arbitrary numbers. In this case, the current value of gamma is taken.
And, finally, if the optimization in a parameter is required but the corresponding grid is unknown, you may call the function SVM::getDefaultGrid. To generate a grid, for example, for gamma, call SVM::getDefaultGrid(SVM::GAMMA).
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
§Note
This alternative version of SVMTrait::train_auto uses the following default values for its arguments:
- k_fold: 10
- cgrid: getDefaultGrid(C)
- gamma_grid: getDefaultGrid(GAMMA)
- p_grid: getDefaultGrid(P)
- nu_grid: getDefaultGrid(NU)
- coeff_grid: getDefaultGrid(COEF)
- degree_grid: getDefaultGrid(DEGREE)
- balanced: false
fn train_auto_with_data(
    &mut self,
    samples: &impl ToInputArray,
    layout: i32,
    responses: &impl ToInputArray,
    k_fold: i32,
    cgrid: Ptr<ParamGrid>,
    gamma_grid: Ptr<ParamGrid>,
    p_grid: Ptr<ParamGrid>,
    nu_grid: Ptr<ParamGrid>,
    coeff_grid: Ptr<ParamGrid>,
    degree_grid: Ptr<ParamGrid>,
    balanced: bool,
) -> Result<bool>
Trains an SVM with optimal parameters.
§Parameters
- samples: training samples
- layout: See ml::SampleTypes.
- responses: vector of responses associated with the training samples.
- kFold: Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
- Cgrid: grid for C
- gammaGrid: grid for gamma
- pGrid: grid for p
- nuGrid: grid for nu
- coeffGrid: grid for coeff
- degreeGrid: grid for degree
- balanced: If true and the problem is 2-class classification, the method creates more balanced cross-validation subsets, that is, subsets with class proportions close to those of the whole training dataset.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
§C++ default parameters
- k_fold: 10
- cgrid: SVM::getDefaultGridPtr(SVM::C)
- gamma_grid: SVM::getDefaultGridPtr(SVM::GAMMA)
- p_grid: SVM::getDefaultGridPtr(SVM::P)
- nu_grid: SVM::getDefaultGridPtr(SVM::NU)
- coeff_grid: SVM::getDefaultGridPtr(SVM::COEF)
- degree_grid: SVM::getDefaultGridPtr(SVM::DEGREE)
- balanced: false
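The layout argument tells the method how samples are arranged: ml::SampleTypes::ROW_SAMPLE means each row of `samples` is one training vector, ml::SampleTypes::COL_SAMPLE means each column is. A plain-Rust sketch of the distinction (the nested-Vec matrix and the `nth_sample` helper are illustrative assumptions, not the opencv crate's Mat API):

```rust
/// Extract the n-th training vector from a 2D matrix under either layout.
/// row_sample = true mirrors ROW_SAMPLE (samples are rows); false mirrors
/// COL_SAMPLE (samples are columns).
fn nth_sample(data: &[Vec<f64>], n: usize, row_sample: bool) -> Vec<f64> {
    if row_sample {
        // ROW_SAMPLE: the n-th row is the n-th sample.
        data[n].clone()
    } else {
        // COL_SAMPLE: gather the n-th element of every row.
        data.iter().map(|row| row[n]).collect()
    }
}
```

For the 2x2 matrix [[1, 2], [3, 4]], sample 0 is [1, 2] under ROW_SAMPLE but [1, 3] under COL_SAMPLE, so passing the wrong layout silently trains on transposed data.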
fn train_auto_with_data_def(
    &mut self,
    samples: &impl ToInputArray,
    layout: i32,
    responses: &impl ToInputArray,
) -> Result<bool>
Trains an SVM with optimal parameters.
§Parameters
- samples: training samples
- layout: See ml::SampleTypes.
- responses: vector of responses associated with the training samples.
- kFold: Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
- Cgrid: grid for C
- gammaGrid: grid for gamma
- pGrid: grid for p
- nuGrid: grid for nu
- coeffGrid: grid for coeff
- degreeGrid: grid for degree
- balanced: If true and the problem is 2-class classification, the method creates more balanced cross-validation subsets, that is, subsets with class proportions close to those of the whole training dataset.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
§Note
This alternative version of SVMTrait::train_auto_with_data uses the following default values for its arguments:
- k_fold: 10
- cgrid: SVM::getDefaultGridPtr(SVM::C)
- gamma_grid: SVM::getDefaultGridPtr(SVM::GAMMA)
- p_grid: SVM::getDefaultGridPtr(SVM::P)
- nu_grid: SVM::getDefaultGridPtr(SVM::NU)
- coeff_grid: SVM::getDefaultGridPtr(SVM::COEF)
- degree_grid: SVM::getDefaultGridPtr(SVM::DEGREE)
- balanced: false
Dyn Compatibility§
This trait is not dyn compatible.
In older versions of Rust, dyn compatibility was called "object safety", so this trait is not object safe.