Module smartcore::svm::svc
Support Vector Classifier.
Support Vector Classifier (SVC) is a binary classifier that uses an optimal hyperplane to separate the points in the input variable space by their class.
During training, SVC chooses a Maximal-Margin hyperplane that can separate all training instances with the largest margin. The margin is calculated as the perpendicular distance from the boundary to only the closest points. Hence, only these points are relevant in defining the hyperplane and in the construction of the classifier. These points are called the support vectors.
While SVC selects a hyperplane with the largest margin, it allows some points in the training data to violate the separating boundary.
The parameter C > 0 controls how SVC handles violating points: the larger its value, the more the algorithm is penalized for incorrectly classified points. In other words, setting this parameter to a small value results in a classifier that tolerates a large number of misclassified samples. Mathematically, the SVC optimization problem can be defined as:
\[\underset{w, \zeta}{minimize} \space \space \frac{1}{2} \lVert \vec{w} \rVert^2 + C\sum_{i=1}^m \zeta_i \]
subject to:
\[y_i(\langle\vec{w}, \vec{x}_i \rangle + b) \geq 1 - \zeta_i \] \[\zeta_i \geq 0 \space \text{for any} \space i = 1, \dots, m\]
where \( m \) is the number of training samples, \( y_i \) is the label of the \( i \)-th sample (either 1 or -1), and \(\langle\vec{w}, \vec{x}_i \rangle + b\) is the decision boundary.
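To make the trade-off in the objective concrete, here is a small self-contained sketch (plain Rust, not smartcore's API; the hyperplane and slack values are made up for illustration) that evaluates the two competing terms for a candidate solution:

```rust
// Evaluate the SVC objective 0.5 * ||w||^2 + C * sum(zeta_i)
// for a toy hyperplane. Values are illustrative, not from a real fit.
fn objective(w: &[f64], zetas: &[f64], c: f64) -> f64 {
    let norm_sq: f64 = w.iter().map(|wi| wi * wi).sum();
    let slack: f64 = zetas.iter().sum();
    0.5 * norm_sq + c * slack
}

fn main() {
    let w = [0.5, -1.0];
    let zetas = [0.0, 0.0, 0.3]; // one sample violates the margin by 0.3
    // A small C tolerates the violation; a large C penalizes it heavily,
    // pushing the optimizer toward a boundary that classifies it correctly.
    println!("C = 1:   objective = {}", objective(&w, &zetas, 1.0));
    println!("C = 200: objective = {}", objective(&w, &zetas, 200.0));
}
```

With a large C, the slack term dominates the margin term, which is why C = 200.0 in the example below produces a classifier that tries hard to fit every training point.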
To solve this optimization problem, SmartCore uses an approximate SVM solver. The optimizer reaches accuracy similar to that of a real SVM after performing two passes through the training examples. You can control the number of passes through the data by changing the epoch parameter of the classifier.
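SmartCore's actual solver is not shown here, but the idea of an epoch-based approximate solver can be sketched with a Pegasos-style stochastic sub-gradient method. The following is a rough, self-contained illustration in plain Rust (all names, the fixed learning rate, and the toy data are hypothetical and independent of smartcore):

```rust
// Pegasos-style SGD sketch for a linear SVM: each epoch is one pass
// over the training data. This is an illustration, not smartcore's solver.
fn train(xs: &[[f64; 2]], ys: &[f64], epochs: usize, lambda: f64) -> ([f64; 2], f64) {
    let mut w = [0.0f64; 2];
    let mut b = 0.0f64;
    let eta = 0.1; // fixed step size, chosen for simplicity
    for _ in 0..epochs {
        for (x, &y) in xs.iter().zip(ys.iter()) {
            let margin = y * (w[0] * x[0] + w[1] * x[1] + b);
            // shrink w toward zero (the regularization / margin term)
            w[0] *= 1.0 - eta * lambda;
            w[1] *= 1.0 - eta * lambda;
            if margin < 1.0 {
                // the point violates the margin: move the boundary toward it
                w[0] += eta * y * x[0];
                w[1] += eta * y * x[1];
                b += eta * y;
            }
        }
    }
    (w, b)
}

fn main() {
    // linearly separable toy data, labels in {-1, 1}
    let xs = [[0.0, 0.0], [0.0, 1.0], [3.0, 3.0], [3.0, 4.0]];
    let ys = [-1.0, -1.0, 1.0, 1.0];
    let (w, b) = train(&xs, &ys, 100, 0.01);
    for (x, &y) in xs.iter().zip(ys.iter()) {
        let score = w[0] * x[0] + w[1] * x[1] + b;
        println!("x = {:?}, label = {}, decision value = {:.3}", x, y, score);
    }
}
```

More epochs give the solver more chances to correct margin violations, which is the same knob the epoch parameter exposes.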
Example:
use smartcore::linalg::naive::dense_matrix::*;
use smartcore::svm::Kernels;
use smartcore::svm::svc::{SVC, SVCParameters};

// Iris dataset (first 10 samples of two classes)
let x = DenseMatrix::from_2d_array(&[
    &[5.1, 3.5, 1.4, 0.2],
    &[4.9, 3.0, 1.4, 0.2],
    &[4.7, 3.2, 1.3, 0.2],
    &[4.6, 3.1, 1.5, 0.2],
    &[5.0, 3.6, 1.4, 0.2],
    &[5.4, 3.9, 1.7, 0.4],
    &[4.6, 3.4, 1.4, 0.3],
    &[5.0, 3.4, 1.5, 0.2],
    &[4.4, 2.9, 1.4, 0.2],
    &[4.9, 3.1, 1.5, 0.1],
    &[7.0, 3.2, 4.7, 1.4],
    &[6.4, 3.2, 4.5, 1.5],
    &[6.9, 3.1, 4.9, 1.5],
    &[5.5, 2.3, 4.0, 1.3],
    &[6.5, 2.8, 4.6, 1.5],
    &[5.7, 2.8, 4.5, 1.3],
    &[6.3, 3.3, 4.7, 1.6],
    &[4.9, 2.4, 3.3, 1.0],
    &[6.6, 2.9, 4.6, 1.3],
    &[5.2, 2.7, 3.9, 1.4],
]);
let y = vec![
    0., 0., 0., 0., 0., 0., 0., 0., 1., 1.,
    1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
];

let svc = SVC::fit(&x, &y, SVCParameters::default().with_c(200.0)).unwrap();
let y_hat = svc.predict(&x).unwrap();
Structs
SVC | Support Vector Classifier
SVCParameters | SVC Parameters