
Module kernel_pca



Kernel Principal Component Analysis (Kernel PCA).

KernelPCA performs non-linear dimensionality reduction by implicitly mapping data into a higher-dimensional (possibly infinite-dimensional) feature space via a kernel function, then performing standard PCA in that space. The mapping is never computed explicitly: all operations are expressed through pairwise kernel evaluations (the kernel trick).

§Kernels

  • Linear: K(x, y) = x . y (equivalent to standard PCA)
  • RBF (Gaussian): K(x, y) = exp(-gamma * ||x - y||^2)
  • Polynomial: K(x, y) = (gamma * x . y + coef0)^degree
  • Sigmoid: K(x, y) = tanh(gamma * x . y + coef0)
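
The four kernels above can be sketched as plain functions over slices. This is a minimal illustration, not the crate's API; the function names and signatures are hypothetical, while `gamma`, `coef0`, and `degree` mirror the hyperparameter names in the formulas.

```rust
/// Dot product x . y; the linear kernel (equivalent to standard PCA).
fn linear(x: &[f64], y: &[f64]) -> f64 {
    x.iter().zip(y).map(|(a, b)| a * b).sum()
}

/// RBF (Gaussian) kernel: exp(-gamma * ||x - y||^2).
fn rbf(x: &[f64], y: &[f64], gamma: f64) -> f64 {
    let sq_dist: f64 = x.iter().zip(y).map(|(a, b)| (a - b).powi(2)).sum();
    (-gamma * sq_dist).exp()
}

/// Polynomial kernel: (gamma * x . y + coef0)^degree.
fn polynomial(x: &[f64], y: &[f64], gamma: f64, coef0: f64, degree: i32) -> f64 {
    (gamma * linear(x, y) + coef0).powi(degree)
}

/// Sigmoid kernel: tanh(gamma * x . y + coef0).
fn sigmoid(x: &[f64], y: &[f64], gamma: f64, coef0: f64) -> f64 {
    (gamma * linear(x, y) + coef0).tanh()
}

fn main() {
    let (x, y) = ([1.0, 2.0], [3.0, 4.0]);
    println!("linear = {}", linear(&x, &y)); // 1*3 + 2*4 = 11
    println!("rbf    = {}", rbf(&x, &y, 0.5)); // ||x - y||^2 = 8, so exp(-4)
    println!("poly   = {}", polynomial(&x, &y, 1.0, 1.0, 2)); // (11 + 1)^2 = 144
    println!("sigm   = {}", sigmoid(&x, &y, 0.1, 0.0)); // tanh(1.1)
}
```

With `gamma = 0` the polynomial and sigmoid kernels degenerate to constants, so in practice `gamma` is kept strictly positive.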

§Algorithm

  1. Compute the kernel matrix K of shape (n_samples, n_samples).
  2. Centre K in feature space: K_c = K - 1_n K - K 1_n + 1_n K 1_n where 1_n is the (n, n) matrix with all entries 1/n.
  3. Eigendecompose K_c using the Jacobi iterative method.
  4. Sort eigenvalues descending and retain the top n_components.
  5. Scale eigenvectors by 1 / sqrt(eigenvalue).
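
Step 2, the feature-space centring, can be sketched directly. Since (1_n K)[i][j] is the mean of column j and (K 1_n)[i][j] is the mean of row i, the formula reduces to subtracting row and column means and adding back the grand mean. This is a standalone sketch with a hypothetical helper name, not the crate's internal implementation.

```rust
/// Centre an n x n kernel matrix in feature space:
/// K_c = K - 1_n K - K 1_n + 1_n K 1_n,
/// i.e. K_c[i][j] = K[i][j] - row_mean[i] - col_mean[j] + grand_mean.
fn centre_kernel(k: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let n = k.len();
    let row_mean: Vec<f64> = k
        .iter()
        .map(|row| row.iter().sum::<f64>() / n as f64)
        .collect();
    let col_mean: Vec<f64> = (0..n)
        .map(|j| k.iter().map(|row| row[j]).sum::<f64>() / n as f64)
        .collect();
    let grand_mean = row_mean.iter().sum::<f64>() / n as f64;
    (0..n)
        .map(|i| {
            (0..n)
                .map(|j| k[i][j] - row_mean[i] - col_mean[j] + grand_mean)
                .collect()
        })
        .collect()
}

fn main() {
    let k = vec![
        vec![4.0, 2.0, 0.0],
        vec![2.0, 3.0, 1.0],
        vec![0.0, 1.0, 5.0],
    ];
    let kc = centre_kernel(&k);
    // After centring, every row of K_c sums to (numerically) zero.
    for row in &kc {
        println!("row sum = {:e}", row.iter().sum::<f64>());
    }
}
```

A useful sanity check on step 2: every row and column of the centred matrix sums to zero, because subtracting the means removes the feature-space mean from each embedded sample.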

§Examples

use ferrolearn_decomp::{KernelPCA, Kernel};
use ferrolearn_core::traits::{Fit, Transform};
use ndarray::array;

let kpca = KernelPCA::<f64>::new(2).with_kernel(Kernel::RBF);
let x = array![
    [1.0, 2.0],
    [3.0, 4.0],
    [5.0, 6.0],
    [7.0, 8.0],
    [9.0, 10.0],
];
let fitted = kpca.fit(&x, &()).unwrap();
let projected = fitted.transform(&x).unwrap();
assert_eq!(projected.ncols(), 2);

Structs§

FittedKernelPCA
A fitted Kernel PCA model holding learned eigendecomposition.
KernelPCA
Kernel PCA configuration.

Enums§

Kernel
The kernel function for Kernel PCA.