# ferrolearn-decomp
Dimensionality reduction and matrix decomposition for the ferrolearn machine learning framework.
## Algorithms

### Linear methods
| Model | Description |
|---|---|
| `PCA` | Principal Component Analysis: project onto the directions of maximum variance |
| `IncrementalPCA` | Incremental PCA for large datasets that don't fit in memory |
| `TruncatedSVD` | Randomized SVD (Halko algorithm); works on uncentered/sparse data |
| `NMF` | Non-negative Matrix Factorization (coordinate-descent and multiplicative-update solvers) |
| `FactorAnalysis` | Factor Analysis fitted via the EM algorithm |
| `FastICA` | Independent Component Analysis |
### Manifold learning
| Model | Description |
|---|---|
| `KernelPCA` | Non-linear PCA via RBF, polynomial, or sigmoid kernels |
| `Isomap` | Isometric mapping via geodesic distances on a k-nearest-neighbour graph |
| `MDS` | Classical Multidimensional Scaling |
| `SpectralEmbedding` | Laplacian Eigenmaps |
| `LLE` | Locally Linear Embedding |
## Example
```rust
use ferrolearn_decomp::PCA;
use ndarray::array;

let x = array![[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]];

let pca = PCA::new(1);
let fitted = pca.fit(&x).unwrap();
let projected = fitted.transform(&x).unwrap();
assert_eq!(projected.ncols(), 1);

// Inspect explained variance
let variance_ratio = fitted.explained_variance_ratio();
```
## License

Licensed under either of the Apache License, Version 2.0 or the MIT License, at your option.