Function ridge_regression
pub fn ridge_regression<F>(
    x: &ArrayView2<'_, F>,
    y: &ArrayView1<'_, F>,
    alpha: Option<F>,
    fit_intercept: Option<bool>,
    normalize: Option<bool>,
    tol: Option<F>,
    max_iter: Option<usize>,
    conf_level: Option<F>,
) -> StatsResult<RegressionResults<F>>
where
    F: Float + Sum<F> + Div<Output = F> + Debug + Display + 'static
        + NumAssign + One + ScalarOperand + Send + Sync,

Perform ridge regression (L2 regularization).

Ridge regression adds an L2 penalty, alpha * ||beta||^2, to the sum of squared residuals, shrinking the coefficients toward zero. This reduces overfitting and stabilizes the estimates when predictors are highly correlated (multicollinearity).

§Arguments

  • x - Independent variables (design matrix)
  • y - Dependent variable
  • alpha - Regularization strength (default: 1.0)
  • fit_intercept - Whether to fit an intercept term (default: true)
  • normalize - Whether to normalize the data before fitting (default: false)
  • tol - Convergence tolerance (default: 1e-4)
  • max_iter - Maximum number of iterations (default: 1000)
  • conf_level - Confidence level for confidence intervals (default: 0.95)
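The optional arguments above are resolved to their documented defaults when `None` is passed. A minimal sketch of that pattern (the `resolve_defaults` helper is purely illustrative and not part of scirs2_stats):

```rust
// Hypothetical illustration of how the documented defaults are applied;
// the actual crate internals may differ.
fn resolve_defaults(alpha: Option<f64>, fit_intercept: Option<bool>) -> (f64, bool) {
    // alpha defaults to 1.0, fit_intercept defaults to true
    (alpha.unwrap_or(1.0), fit_intercept.unwrap_or(true))
}

fn main() {
    // Passing None everywhere yields the documented defaults
    assert_eq!(resolve_defaults(None, None), (1.0, true));
    // Explicit values override the defaults
    assert_eq!(resolve_defaults(Some(0.1), Some(false)), (0.1, false));
    println!("defaults resolved");
}
```

This is why the example below can pass `None` for every option after `alpha` and still get a sensible fit.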

§Returns

A RegressionResults struct containing the fitted coefficients and associated statistics, including confidence intervals at the requested conf_level.

§Examples

use scirs2_core::ndarray::{array, Array2};
use scirs2_stats::ridge_regression;

// Create a design matrix with 3 perfectly collinear variables --
// exactly the setting where the ridge penalty helps
let x = Array2::from_shape_vec((5, 3), vec![
    1.0, 2.0, 3.0,
    2.0, 3.0, 4.0,
    3.0, 4.0, 5.0,
    4.0, 5.0, 6.0,
    5.0, 6.0, 7.0,
]).expect("design matrix has 5 x 3 = 15 elements");

// Target values
let y = array![10.0, 15.0, 20.0, 25.0, 30.0];

// Perform ridge regression with alpha = 0.1, leaving the other
// options (intercept, normalization, tolerance, ...) at their defaults
let result = ridge_regression(&x.view(), &y.view(), Some(0.1), None, None, None, None, None)
    .expect("ridge regression should succeed on well-formed input");

// Check that we get some coefficients
assert!(!result.coefficients.is_empty());
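The shrinkage effect of alpha is easiest to see in the closed-form estimate for a single feature with no intercept, beta = sum(x*y) / (sum(x^2) + alpha). A self-contained sketch (the `ridge_1d` helper is illustrative only, not part of scirs2_stats):

```rust
// Closed-form ridge estimate for one feature, no intercept:
// beta = sum(x*y) / (sum(x^2) + alpha).
// Illustrative helper, not the crate's implementation.
fn ridge_1d(x: &[f64], y: &[f64], alpha: f64) -> f64 {
    let sxy: f64 = x.iter().zip(y.iter()).map(|(a, b)| a * b).sum();
    let sxx: f64 = x.iter().map(|a| a * a).sum();
    sxy / (sxx + alpha)
}

fn main() {
    let x = [1.0, 2.0, 3.0];
    let y = [2.0, 4.0, 6.0];
    // With no penalty this recovers the OLS slope of 2.0
    println!("{}", ridge_1d(&x, &y, 0.0)); // prints 2
    // A positive alpha shrinks the coefficient below 2.0
    println!("{}", ridge_1d(&x, &y, 1.0));
}
```

Larger alpha values shrink the coefficient further toward zero, trading a little bias for lower variance.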