Kernel Principal Component Analysis (Kernel PCA).
KernelPCA performs non-linear dimensionality reduction by first mapping
data into a higher-dimensional (possibly infinite-dimensional) feature space
via a kernel function, then performing standard PCA in that space.
§Kernels
- Linear: `K(x, y) = x . y` (equivalent to standard PCA)
- RBF (Gaussian): `K(x, y) = exp(-gamma * ||x - y||^2)`
- Polynomial: `K(x, y) = (gamma * x . y + coef0)^degree`
- Sigmoid: `K(x, y) = tanh(gamma * x . y + coef0)`
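The four kernels can be sketched as plain Rust functions over slices. This is a minimal illustration of the formulas above, not the crate's implementation; the function names and signatures here are hypothetical (the crate dispatches on the `Kernel` enum instead):

```rust
// Hypothetical free functions illustrating each kernel formula.
fn dot(x: &[f64], y: &[f64]) -> f64 {
    x.iter().zip(y).map(|(a, b)| a * b).sum()
}

fn linear(x: &[f64], y: &[f64]) -> f64 {
    dot(x, y)
}

fn rbf(x: &[f64], y: &[f64], gamma: f64) -> f64 {
    // ||x - y||^2, then exp(-gamma * ...)
    let sq_dist: f64 = x.iter().zip(y).map(|(a, b)| (a - b).powi(2)).sum();
    (-gamma * sq_dist).exp()
}

fn polynomial(x: &[f64], y: &[f64], gamma: f64, coef0: f64, degree: i32) -> f64 {
    (gamma * dot(x, y) + coef0).powi(degree)
}

fn sigmoid(x: &[f64], y: &[f64], gamma: f64, coef0: f64) -> f64 {
    (gamma * dot(x, y) + coef0).tanh()
}

fn main() {
    let x = [1.0, 2.0];
    let y = [3.0, 4.0];
    println!("linear  = {}", linear(&x, &y)); // 1*3 + 2*4 = 11
    println!("rbf     = {:.6}", rbf(&x, &y, 0.5)); // exp(-0.5 * 8) = exp(-4)
    println!("poly    = {}", polynomial(&x, &y, 1.0, 1.0, 2)); // (11 + 1)^2 = 144
    println!("sigmoid = {:.6}", sigmoid(&x, &y, 0.01, 0.0)); // tanh(0.11)
}
```

With a linear kernel the kernel matrix equals the Gram matrix of the raw data, which is why that case reduces to standard PCA.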
§Algorithm
- Compute the kernel matrix `K` of shape `(n_samples, n_samples)`.
- Centre `K` in feature space: `K_c = K - 1_n K - K 1_n + 1_n K 1_n`, where `1_n` is the `(n, n)` matrix with all entries `1/n`.
- Eigendecompose `K_c` using the Jacobi iterative method.
- Sort eigenvalues descending and retain the top `n_components`.
- Scale eigenvectors by `1 / sqrt(eigenvalue)`.
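The centring step works out element-wise to `K_c[i][j] = K[i][j] - row_mean[i] - col_mean[j] + grand_mean`. A minimal sketch using `Vec<Vec<f64>>` in place of the crate's `ndarray` types (the helper name is hypothetical, not part of the crate's API):

```rust
// Double-centre a symmetric kernel matrix in feature space:
// K_c = K - 1_n K - K 1_n + 1_n K 1_n
// where 1_n is the n x n matrix with every entry 1/n.
fn centre_kernel(k: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let n = k.len();
    let nf = n as f64;
    // 1_n K averages columns; K 1_n averages rows; 1_n K 1_n is the grand mean.
    let row_mean: Vec<f64> = k.iter().map(|r| r.iter().sum::<f64>() / nf).collect();
    let col_mean: Vec<f64> = (0..n)
        .map(|j| k.iter().map(|r| r[j]).sum::<f64>() / nf)
        .collect();
    let grand_mean = row_mean.iter().sum::<f64>() / nf;
    (0..n)
        .map(|i| {
            (0..n)
                .map(|j| k[i][j] - row_mean[i] - col_mean[j] + grand_mean)
                .collect()
        })
        .collect()
}

fn main() {
    let k = vec![vec![4.0, 2.0], vec![2.0, 1.0]];
    let kc = centre_kernel(&k);
    // Every row (and, by symmetry, column) of the centred matrix sums to zero.
    for row in &kc {
        assert!(row.iter().sum::<f64>().abs() < 1e-12);
    }
    println!("{:?}", kc);
}
```

Centring is required because the mapped data has no reason to be mean-zero in feature space, and PCA assumes centred data.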
§Examples
```rust
use ferrolearn_decomp::{KernelPCA, Kernel};
use ferrolearn_core::traits::{Fit, Transform};
use ndarray::array;

let kpca = KernelPCA::<f64>::new(2).with_kernel(Kernel::RBF);
let x = array![
    [1.0, 2.0],
    [3.0, 4.0],
    [5.0, 6.0],
    [7.0, 8.0],
    [9.0, 10.0],
];
let fitted = kpca.fit(&x, &()).unwrap();
let projected = fitted.transform(&x).unwrap();
assert_eq!(projected.ncols(), 2);
```

Structs§
- FittedKernelPCA: A fitted Kernel PCA model holding the learned eigendecomposition.
- KernelPCA: Kernel PCA configuration.
Enums§
- Kernel: The kernel function for Kernel PCA.