Trait opencv::ml::prelude::SVMSGD

pub trait SVMSGD: StatModel {
    pub fn as_raw_SVMSGD(&self) -> *const c_void;
    pub fn as_raw_mut_SVMSGD(&mut self) -> *mut c_void;

    pub fn get_weights(&mut self) -> Result<Mat> { ... }
    pub fn get_shift(&mut self) -> Result<f32> { ... }
    pub fn set_optimal_parameters(
        &mut self,
        svmsgd_type: i32,
        margin_type: i32
    ) -> Result<()> { ... }
    pub fn get_svmsgd_type(&self) -> Result<i32> { ... }
    pub fn set_svmsgd_type(&mut self, svmsgd_type: i32) -> Result<()> { ... }
    pub fn get_margin_type(&self) -> Result<i32> { ... }
    pub fn set_margin_type(&mut self, margin_type: i32) -> Result<()> { ... }
    pub fn get_margin_regularization(&self) -> Result<f32> { ... }
    pub fn set_margin_regularization(
        &mut self,
        margin_regularization: f32
    ) -> Result<()> { ... }
    pub fn get_initial_step_size(&self) -> Result<f32> { ... }
    pub fn set_initial_step_size(
        &mut self,
        initial_step_size: f32
    ) -> Result<()> { ... }
    pub fn get_step_decreasing_power(&self) -> Result<f32> { ... }
    pub fn set_step_decreasing_power(
        &mut self,
        step_decreasing_power: f32
    ) -> Result<()> { ... }
    pub fn get_term_criteria(&self) -> Result<TermCriteria> { ... }
    pub fn set_term_criteria(&mut self, val: TermCriteria) -> Result<()> { ... }
}

Stochastic Gradient Descent SVM classifier

SVMSGD provides a fast and easy-to-use implementation of the SVM classifier using the Stochastic Gradient Descent approach, as presented in [bottou2010large].

The classifier has the following parameters:

  • model type,
  • margin type,
  • margin regularization (λ),
  • initial step size (γ₀),
  • step decreasing power (c),
  • and termination criteria.

The model type may have one of the following values: SGD and ASGD.

  • SGD is the classic version of the SVMSGD classifier: every next step is calculated by the formula

    w_{t+1} = w_t - γ(t) · dQ_i/dw |_{w = w_t}

    where

    • w_t is the weights vector of the decision function at step t,
    • γ(t) is the step size of model parameters at iteration t; it is decreased on each step by the formula γ(t) = γ₀ / (1 + λ γ₀ t)^c,
    • Q_i is the target functional from the SVM task for the sample with number i; this sample is chosen stochastically on each step of the algorithm.
  • ASGD is the Average Stochastic Gradient Descent SVM classifier. The ASGD classifier averages the weights vector on each step of the algorithm by the formula ŵ_{t+1} = t/(1+t) · ŵ_t + 1/(1+t) · w_{t+1}.

The recommended model type is ASGD (following [bottou2010large]).
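These update rules are easy to check numerically. The standalone Rust sketch below (illustrative only, not the crate's implementation; the gradient is a placeholder constant) prints the step-size schedule γ(t) and the ASGD running average for a single scalar weight:

// Illustrative sketch of the formulas above; not the OpenCV implementation.
// gamma(t) = gamma_0 / (1 + lambda * gamma_0 * t)^c
fn step_size(gamma0: f32, lambda: f32, c: f32, t: f32) -> f32 {
    gamma0 / (1.0 + lambda * gamma0 * t).powf(c)
}

fn main() {
    let (gamma0, lambda, c) = (0.05_f32, 0.00001_f32, 0.75_f32);
    let (mut w, mut w_hat) = (0.0_f32, 0.0_f32);
    for t in 0..5 {
        let gamma = step_size(gamma0, lambda, c, t as f32);
        let grad = 1.0_f32; // stand-in for dQ_i/dw evaluated at w_t
        // SGD step: w_{t+1} = w_t - gamma(t) * dQ_i/dw
        w -= gamma * grad;
        // ASGD averaging: w_hat_{t+1} = t/(1+t) * w_hat_t + 1/(1+t) * w_{t+1}
        w_hat = (t as f32 / (1.0 + t as f32)) * w_hat + (1.0 / (1.0 + t as f32)) * w;
        println!("t = {t}: gamma = {gamma:.5}, w = {w:.5}, w_hat = {w_hat:.5}");
    }
}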

The margin type may have one of the following values: SOFT_MARGIN or HARD_MARGIN.

  • Use the HARD_MARGIN type if your sets are linearly separable.
  • Use the SOFT_MARGIN type if your sets are not linearly separable or contain outliers.
  • In the general case (if you know nothing about the linear separability of your sets), use SOFT_MARGIN.

The other parameters may be described as follows:

  • The margin regularization parameter is responsible for decreasing the weights at each step and for the strength of the restrictions on outliers (the smaller the parameter, the lower the probability that an outlier will be ignored). The recommended value for the SGD model is 0.0001; for the ASGD model it is 0.00001.

  • The initial step size parameter is the initial value γ₀ of the step size γ(t). You will have to find the best initial step size for your problem.

  • The step decreasing power is the power parameter c for the decrease of γ(t) by the formula mentioned above. The recommended value for the SGD model is 1; for the ASGD model it is 0.75.

  • Termination criteria can be TermCriteria::COUNT, TermCriteria::EPS or TermCriteria::COUNT + TermCriteria::EPS. You will have to find the best termination criteria for your problem.

Note that the parameters margin regularization, initial step size, and step decreasing power should be positive.
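In the Rust bindings these parameters map onto the setters documented further down (set_svmsgd_type, set_margin_type, set_margin_regularization, set_initial_step_size, set_step_decreasing_power). A minimal configuration sketch for the plain SGD model, assuming the C++ enum values are exposed as the i32 constants ml::SVMSGD_SGD and ml::SVMSGD_SOFT_MARGIN in this crate version:

use opencv::{ml, prelude::*, Result};

// Sketch: manual parameter setup for the plain SGD model. The constant names
// ml::SVMSGD_SGD and ml::SVMSGD_SOFT_MARGIN are assumed to be the i32 values
// generated for SVMSGD::SvmsgdType / SVMSGD::MarginType in this crate version.
fn configure_sgd() -> Result<()> {
    let mut svmsgd = <dyn ml::SVMSGD>::create()?;
    svmsgd.set_svmsgd_type(ml::SVMSGD_SGD)?;
    svmsgd.set_margin_type(ml::SVMSGD_SOFT_MARGIN)?;
    svmsgd.set_margin_regularization(0.0001)?; // recommended for SGD (0.00001 for ASGD)
    svmsgd.set_initial_step_size(0.05)?;       // problem dependent; no general recommendation
    svmsgd.set_step_decreasing_power(1.0)?;    // recommended for SGD (0.75 for ASGD)
    Ok(())
}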

To use the SVMSGD algorithm, do the following:

  • first, create the SVMSGD object. The algorithm will set optimal parameters by default, but you can set your own parameters via functions setSvmsgdType(), setMarginType(), setMarginRegularization(), setInitialStepSize(), and setStepDecreasingPower().

  • then the SVM model can be trained using the training features and the corresponding labels by the method train().

  • after that, the label of a new feature vector can be predicted using the method predict().

// Create an empty SVMSGD object (optimal parameters are set by default)
cv::Ptr<SVMSGD> svmsgd = SVMSGD::create();

// Train the Stochastic Gradient Descent SVM on a prepared cv::Ptr<TrainData>
svmsgd->train(trainData);

// Predict labels for the new samples (cv::Mat samples, cv::Mat responses)
svmsgd->predict(samples, responses);
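For this crate, the same workflow looks roughly as follows. The sketch assumes the StatModel bindings expose train(samples, layout, responses) and predict(samples, results, flags) with these signatures and that ml::ROW_SAMPLE is available as a layout constant; check the StatModel documentation of your crate version.

use opencv::{core, ml, prelude::*, Result};

// Rough Rust equivalent of the C++ snippet above. `samples` and `labels` are
// CV_32F Mats with one training sample per row. The train/predict signatures
// and the ml::ROW_SAMPLE constant are assumptions about the StatModel bindings
// in this crate version.
fn run(samples: &core::Mat, labels: &core::Mat, new_samples: &core::Mat) -> Result<core::Mat> {
    // Create an empty SVMSGD model (optimal parameters are applied by default).
    let mut svmsgd = <dyn ml::SVMSGD>::create()?;

    // Train the Stochastic Gradient Descent SVM.
    svmsgd.train(samples, ml::ROW_SAMPLE, labels)?;

    // Predict labels for the new samples.
    let mut responses = core::Mat::default(); // may need `?` in older crate versions
    svmsgd.predict(new_samples, &mut responses, 0)?;
    Ok(responses)
}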

Required methods

pub fn as_raw_SVMSGD(&self) -> *const c_void

pub fn as_raw_mut_SVMSGD(&mut self) -> *mut c_void

Provided methods

pub fn get_weights(&mut self) -> Result<Mat>

Returns

the weights of the trained model (decision function f(x) = weights * x + shift).

pub fn get_shift(&mut self) -> Result<f32>

Returns

the shift of the trained model (decision function f(x) = weights * x + shift).
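Together, get_weights and get_shift give the complete decision function. The hypothetical helper below (its name, and the assumption that the sample is a 1×N CV_32F row Mat, are for illustration only) evaluates f(x) via Mat::dot:

use opencv::{core, ml, prelude::*, Result};

// Hypothetical helper (not part of the crate): evaluates f(x) = weights * x + shift
// for a single sample stored as a 1xN CV_32F row Mat matching the weights length.
fn decision_value(model: &mut dyn ml::SVMSGD, sample: &core::Mat) -> Result<f32> {
    let weights = model.get_weights()?;
    let shift = model.get_shift()?;
    // Mat::dot sums the element-wise products of the two Mats.
    Ok(weights.dot(sample)? as f32 + shift)
}

The sign of the returned value indicates on which side of the separating hyperplane the sample lies, which is what predict() uses to assign the class label.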

pub fn set_optimal_parameters(
    &mut self,
    svmsgd_type: i32,
    margin_type: i32
) -> Result<()>

Sets optimal parameter values for the chosen SVM SGD model; see the sketch below.

Parameters

  • svmsgdType: the type of the SVMSGD classifier.
  • marginType: the type of margin constraint.

C++ default parameters

  • svmsgd_type: SVMSGD::ASGD
  • margin_type: SVMSGD::SOFT_MARGIN
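A minimal call sketch, assuming the C++ enum values are exposed as the i32 constants ml::SVMSGD_ASGD and ml::SVMSGD_SOFT_MARGIN in this crate version:

use opencv::{ml, prelude::*, Result};

// Sketch: explicitly re-apply the ASGD / soft-margin defaults to a fresh model.
// ml::SVMSGD_ASGD and ml::SVMSGD_SOFT_MARGIN are assumed constant names.
fn reset_to_defaults() -> Result<()> {
    let mut svmsgd = <dyn ml::SVMSGD>::create()?;
    svmsgd.set_optimal_parameters(ml::SVMSGD_ASGD, ml::SVMSGD_SOFT_MARGIN)?;
    Ok(())
}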

pub fn get_svmsgd_type(&self) -> Result<i32>

Algorithm type, one of SVMSGD::SvmsgdType.

See also

setSvmsgdType

pub fn set_svmsgd_type(&mut self, svmsgd_type: i32) -> Result<()>

Algorithm type, one of SVMSGD::SvmsgdType.

See also

setSvmsgdType getSvmsgdType

pub fn get_margin_type(&self) -> Result<i32>

Margin type, one of SVMSGD::MarginType.

See also

setMarginType

pub fn set_margin_type(&mut self, margin_type: i32) -> Result<()>

Margin type, one of SVMSGD::MarginType.

See also

setMarginType getMarginType

pub fn get_margin_regularization(&self) -> Result<f32>

Parameter marginRegularization of an SVMSGD optimization problem.

See also

setMarginRegularization

pub fn set_margin_regularization(
    &mut self,
    margin_regularization: f32
) -> Result<()>

Parameter marginRegularization of an SVMSGD optimization problem.

See also

setMarginRegularization getMarginRegularization

pub fn get_initial_step_size(&self) -> Result<f32>

Parameter initialStepSize of an SVMSGD optimization problem.

See also

setInitialStepSize

pub fn set_initial_step_size(&mut self, initial_step_size: f32) -> Result<()>

Parameter initialStepSize of an SVMSGD optimization problem.

See also

setInitialStepSize getInitialStepSize

pub fn get_step_decreasing_power(&self) -> Result<f32>

Parameter stepDecreasingPower of an SVMSGD optimization problem.

See also

setStepDecreasingPower

pub fn set_step_decreasing_power(
    &mut self,
    step_decreasing_power: f32
) -> Result<()>

Parameter stepDecreasingPower of an SVMSGD optimization problem.

See also

setStepDecreasingPower getStepDecreasingPower

pub fn get_term_criteria(&self) -> Result<TermCriteria>

Termination criteria of the training algorithm. You can specify the maximum number of iterations (maxCount) and/or how much the error could change between the iterations to make the algorithm continue (epsilon).

See also

setTermCriteria

pub fn set_term_criteria(&mut self, val: TermCriteria) -> Result<()>

Termination criteria of the training algorithm. You can specify the maximum number of iterations (maxCount) and/or how much the error could change between the iterations to make the algorithm continue (epsilon).

See also

setTermCriteria getTermCriteria
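A configuration sketch for these criteria, assuming this crate version provides core::TermCriteria::new(typ, max_count, epsilon) and the i32 constants core::TermCriteria_COUNT and core::TermCriteria_EPS:

use opencv::{core, ml, prelude::*, Result};

// Sketch: stop after 1000 iterations or once the change drops below 1e-3,
// whichever comes first. The TermCriteria::new constructor and the
// TermCriteria_COUNT / TermCriteria_EPS constant names are assumptions.
fn set_criteria(svmsgd: &mut dyn ml::SVMSGD) -> Result<()> {
    let criteria = core::TermCriteria::new(
        core::TermCriteria_COUNT + core::TermCriteria_EPS,
        1000,
        1e-3,
    )?;
    svmsgd.set_term_criteria(criteria)
}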


Implementations

impl<'_> dyn SVMSGD + '_

pub fn create() -> Result<Ptr<dyn SVMSGD>>

Creates an empty model. Use StatModel::train to train the model. Since SVMSGD has several parameters, you may want to find the best parameters for your problem, or use setOptimalParameters() to set some default parameters.

pub fn load(filepath: &str, node_name: &str) -> Result<Ptr<dyn SVMSGD>>

Loads and creates a serialized SVMSGD from a file.

Use SVMSGD::save to serialize and store an SVMSGD to disk. Load the SVMSGD from this file again by calling this function with the path to the file. Optionally, specify the node of the file containing the classifier.

Parameters

  • filepath: path to serialized SVMSGD
  • nodeName: name of node containing the classifier

C++ default parameters

  • node_name: String()
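A usage sketch with a hypothetical file path:

use opencv::{ml, Result};

// Sketch: reload a previously saved classifier. "svmsgd.xml" is a hypothetical
// path; the empty node name mirrors the C++ default of String().
fn reload() -> Result<()> {
    // The returned model is ready for predict(); no retraining is needed.
    let _svmsgd = <dyn ml::SVMSGD>::load("svmsgd.xml", "")?;
    Ok(())
}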

Implementors

impl SVMSGD for PtrOfSVMSGD
