Module smartcore::linear::logistic_regression
Logistic Regression
As with linear regression, logistic regression explains the outcome as a linear combination of predictor variables \(X\), but rather than modeling the response directly, it models the probability that \(y\) belongs to a particular category, \(Pr(y = 1|X)\), as:
\[ Pr(y = 1|X) \approx \frac{e^{\beta_0 + \sum_{i=1}^n \beta_iX_i}}{1 + e^{\beta_0 + \sum_{i=1}^n \beta_iX_i}} \]
SmartCore uses the limited-memory BFGS (L-BFGS) method to find estimates of the regression coefficients, \(\beta\).
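The probability above is simply the sigmoid function applied to the linear predictor \(\beta_0 + \sum_{i=1}^n \beta_iX_i\). A minimal plain-Rust sketch of that computation (the helper name `predict_proba` is illustrative, not part of SmartCore's API):

```rust
/// Sigmoid of the linear predictor beta_0 + sum(beta_i * x_i).
/// `beta` holds [beta_0, beta_1, ..., beta_n]; `x` holds the n features.
fn predict_proba(beta: &[f64], x: &[f64]) -> f64 {
    let z = beta[0] + beta[1..].iter().zip(x).map(|(b, xi)| b * xi).sum::<f64>();
    1.0 / (1.0 + (-z).exp())
}

fn main() {
    // With all-zero coefficients the model is maximally uncertain: Pr = 0.5.
    let p = predict_proba(&[0.0, 0.0, 0.0], &[1.0, 2.0]);
    println!("{}", p); // prints 0.5
}
```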
Example:
use smartcore::linalg::naive::dense_matrix::*;
use smartcore::linear::logistic_regression::*;
// Iris data
let x = DenseMatrix::from_2d_array(&[
&[5.1, 3.5, 1.4, 0.2],
&[4.9, 3.0, 1.4, 0.2],
&[4.7, 3.2, 1.3, 0.2],
&[4.6, 3.1, 1.5, 0.2],
&[5.0, 3.6, 1.4, 0.2],
&[5.4, 3.9, 1.7, 0.4],
&[4.6, 3.4, 1.4, 0.3],
&[5.0, 3.4, 1.5, 0.2],
&[4.4, 2.9, 1.4, 0.2],
&[4.9, 3.1, 1.5, 0.1],
&[7.0, 3.2, 4.7, 1.4],
&[6.4, 3.2, 4.5, 1.5],
&[6.9, 3.1, 4.9, 1.5],
&[5.5, 2.3, 4.0, 1.3],
&[6.5, 2.8, 4.6, 1.5],
&[5.7, 2.8, 4.5, 1.3],
&[6.3, 3.3, 4.7, 1.6],
&[4.9, 2.4, 3.3, 1.0],
&[6.6, 2.9, 4.6, 1.3],
&[5.2, 2.7, 3.9, 1.4],
]);
let y: Vec<f64> = vec![
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
];
let lr = LogisticRegression::fit(&x, &y, Default::default()).unwrap();
let y_hat = lr.predict(&x).unwrap();
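The predictions `y_hat` can then be compared against the labels `y`. SmartCore ships its own evaluation metrics; purely for illustration, a standalone accuracy helper (not part of SmartCore) looks like:

```rust
/// Fraction of predictions that exactly match the true labels.
fn accuracy(y_true: &[f64], y_pred: &[f64]) -> f64 {
    let hits = y_true.iter().zip(y_pred).filter(|(t, p)| t == p).count();
    hits as f64 / y_true.len() as f64
}

fn main() {
    let y = vec![0., 0., 1., 1.];
    let y_hat = vec![0., 1., 1., 1.];
    println!("{}", accuracy(&y, &y_hat)); // prints 0.75
}
```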
References:
- “Pattern Recognition and Machine Learning”, C.M. Bishop, Linear Models for Classification
- “An Introduction to Statistical Learning”, James G., Witten D., Hastie T., Tibshirani R., 4.3 Logistic Regression
- “On the Limited Memory BFGS Method for Large Scale Optimization”, Liu D.C., Nocedal J., Mathematical Programming, 1989
Structs
- Logistic Regression
- Logistic Regression parameters
Enums
- Solver options for logistic regression. Currently only the LBFGS solver is supported.