Module smartcore::svm::svr
Epsilon-Support Vector Regression.
Support Vector Regression (SVR) is a popular regression algorithm built on the same principles as SVM.
Just like SVC, SVR finds an optimal decision boundary \(f(x)\) that separates all training instances with the largest margin. Unlike SVC, in \(\epsilon\)-SVR the goal is to find a function \(f(x)\) that deviates from the known targets \(y_i\) by at most \(\epsilon\) for all the training data. To find this function, we need to solve the following optimization problem:
\[\underset{\vec{w}, \zeta}{\text{minimize}} \space \frac{1}{2} \lVert \vec{w} \rVert^2 + C\sum_{i=1}^m \zeta_i \]
subject to:
\[y_i - \langle\vec{w}, \vec{x}_i \rangle - b \leq \epsilon + \zeta_i \] \[\langle\vec{w}, \vec{x}_i \rangle + b - y_i \leq \epsilon + \zeta_i \] \[\zeta_i \geq 0 \space \text{for any} \space i = 1, ... , m\]
Where \( m \) is the number of training samples, \( y_i \) is a target value and \(\langle\vec{w}, \vec{x}_i \rangle + b\) is the decision boundary.
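Solving this problem (typically via its dual) yields a prediction function expressed through the support vectors. A standard form, sketched here for a general kernel \(K\) (the kernelized form is an assumption not spelled out in this page; for the linear kernel, \(K(\vec{x}_i, \vec{x}) = \langle\vec{x}_i, \vec{x}\rangle\)):

\[f(x) = \sum_{i=1}^m (\alpha_i - \alpha_i^*) K(\vec{x}_i, \vec{x}) + b\]

where \(\alpha_i, \alpha_i^*\) are the dual coefficients; only the support vectors, the samples lying on or outside the \(\epsilon\)-tube, have nonzero coefficients.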
The parameter \(C > 0\) determines the trade-off between the flatness of \(f(x)\) and the amount up to which deviations larger than \(\epsilon\) are tolerated.
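The role of \(\epsilon\) and the slack variables can be seen in the \(\epsilon\)-insensitive loss this formulation minimizes: residuals inside the \(\epsilon\)-tube cost nothing, larger ones cost only their excess. A minimal stdlib-only sketch (the function name `eps_insensitive_loss` is our own, not part of smartcore):

```rust
/// ε-insensitive loss: zero inside the ε-tube, linear outside it.
/// The positive part max(0, |residual| − ε) mirrors the slack ζ_i
/// in the primal problem above.
fn eps_insensitive_loss(y_true: f64, y_pred: f64, eps: f64) -> f64 {
    (y_true - y_pred).abs().max(eps) - eps
}

fn main() {
    // Residual 1.5 with ε = 2.0 lies inside the tube: no penalty.
    assert_eq!(eps_insensitive_loss(10.0, 8.5, 2.0), 0.0);
    // Residual 5.0 with ε = 2.0: the penalty is the 3.0 excess.
    assert_eq!(eps_insensitive_loss(10.0, 5.0, 2.0), 3.0);
    println!("ok");
}
```

A larger `C` weights these penalties more heavily relative to the flatness term \(\frac{1}{2}\lVert\vec{w}\rVert^2\), so the fitted function tracks the data more closely.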
Example:
```rust
use smartcore::linalg::naive::dense_matrix::*;
use smartcore::svm::*;
use smartcore::svm::svr::{SVR, SVRParameters};

// Longley dataset (https://www.statsmodels.org/stable/datasets/generated/longley.html)
let x = DenseMatrix::from_2d_array(&[
    &[234.289, 235.6, 159.0, 107.608, 1947., 60.323],
    &[259.426, 232.5, 145.6, 108.632, 1948., 61.122],
    &[258.054, 368.2, 161.6, 109.773, 1949., 60.171],
    &[284.599, 335.1, 165.0, 110.929, 1950., 61.187],
    &[328.975, 209.9, 309.9, 112.075, 1951., 63.221],
    &[346.999, 193.2, 359.4, 113.270, 1952., 63.639],
    &[365.385, 187.0, 354.7, 115.094, 1953., 64.989],
    &[363.112, 357.8, 335.0, 116.219, 1954., 63.761],
    &[397.469, 290.4, 304.8, 117.388, 1955., 66.019],
    &[419.180, 282.2, 285.7, 118.734, 1956., 67.857],
    &[442.769, 293.6, 279.8, 120.445, 1957., 68.169],
    &[444.546, 468.1, 263.7, 121.950, 1958., 66.513],
    &[482.704, 381.3, 255.2, 123.366, 1959., 68.655],
    &[502.601, 393.1, 251.4, 125.368, 1960., 69.564],
    &[518.173, 480.6, 257.2, 127.852, 1961., 69.331],
    &[554.894, 400.7, 282.7, 130.081, 1962., 70.551],
]);
let y: Vec<f64> = vec![83.0, 88.5, 88.2, 89.5, 96.2, 98.1, 99.0,
    100.0, 101.2, 104.6, 108.4, 110.8, 112.6, 114.2, 115.7, 116.9];

let svr = SVR::fit(&x, &y,
    SVRParameters::default().with_eps(2.0).with_c(10.0)).unwrap();
let y_hat = svr.predict(&x).unwrap();
```
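To judge the fit, `y_hat` is usually compared against `y` with a metric such as mean squared error. smartcore ships its own metrics module for this; the stdlib-only helper below (our own sketch, not the library API) shows what such a comparison computes:

```rust
/// Mean squared error between targets and predictions.
/// Panics if the slices differ in length or are empty.
fn mean_squared_error(y_true: &[f64], y_pred: &[f64]) -> f64 {
    assert_eq!(y_true.len(), y_pred.len());
    assert!(!y_true.is_empty());
    let sum: f64 = y_true
        .iter()
        .zip(y_pred)
        .map(|(t, p)| (t - p).powi(2))
        .sum();
    sum / y_true.len() as f64
}

fn main() {
    // A perfect prediction has zero error ...
    assert_eq!(mean_squared_error(&[83.0, 88.5], &[83.0, 88.5]), 0.0);
    // ... while being off by 2.0 everywhere gives MSE = 4.0.
    assert_eq!(mean_squared_error(&[83.0, 88.5], &[85.0, 86.5]), 4.0);
    println!("ok");
}
```

Note that with `eps = 2.0` the trained model is allowed to deviate from each target by up to 2.0 without penalty, so some residual error on the training set is expected.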
References:
- "Support Vector Machines", Kowalczyk A., 2017
- "A Fast Algorithm for Training Support Vector Machines", Platt J.C., 1998
- "Working Set Selection Using Second Order Information for Training Support Vector Machines", Rong-En Fan et al., 2005
- "A tutorial on support vector regression", Smola A.J., Schölkopf B., 2003
Structs
- SVR: Epsilon-Support Vector Regression
- SVRParameters: SVR parameters