Crate ferrolearn_decomp

§ferrolearn-decomp

Dimensionality reduction and matrix decomposition for the ferrolearn machine learning framework.

This crate provides PCA, TruncatedSVD, NMF, Kernel PCA, and manifold learning methods that follow the ferrolearn Fit/Transform trait pattern.

§Algorithms

  • PCA — Principal Component Analysis. Centres data and projects onto the directions of maximum variance.
  • TruncatedSVD — Truncated Singular Value Decomposition using a randomised algorithm. Does not centre data, making it suitable for sparse inputs.
  • NMF — Non-negative Matrix Factorization. Decomposes a non-negative matrix X into W * H where both factors are non-negative.
  • KernelPCA — Kernel PCA. Non-linear dimensionality reduction via a kernel-induced feature space.
  • MDS — Classical Multidimensional Scaling. Embeds data preserving pairwise distances.
  • Isomap — Isometric Mapping. Non-linear dimensionality reduction via geodesic distances on a kNN graph.
  • SpectralEmbedding — Laplacian Eigenmaps. Non-linear dimensionality reduction via the normalised graph Laplacian.
  • LLE — Locally Linear Embedding. Non-linear dimensionality reduction preserving local reconstruction weights.
  • Tsne — t-distributed Stochastic Neighbor Embedding. Non-linear dimensionality reduction using Barnes-Hut approximation.
  • Umap — Uniform Manifold Approximation and Projection. Fast non-linear dimensionality reduction based on topological data analysis.
  • LatentDirichletAllocation — Latent Dirichlet Allocation topic model. Discovers latent topics in document-term matrices.
  • DictionaryLearning — Sparse coding with a learned dictionary.

§Pipeline Integration

PCA<f64>, TruncatedSVD<f64>, NMF<f64>, and KernelPCA<f64> all implement PipelineTransformer so they can be used as transformer steps in a Pipeline.
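
A hypothetical sketch of what this enables; the `Pipeline` builder method names below are illustrative only and are not taken from the actual ferrolearn API:

```rust
use ferrolearn_decomp::PCA;

// Because PCA<f64> implements PipelineTransformer, it can be boxed as a
// transformer step. Builder names here (`new`, `add_transformer`) are
// placeholders — consult the pipeline crate's docs for the real API:
//
// let pipeline = Pipeline::new()
//     .add_transformer(Box::new(PCA::<f64>::new(2)))
//     .add_estimator(Box::new(estimator));
```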

§Examples

use ferrolearn_decomp::PCA;
use ferrolearn_core::traits::{Fit, Transform};
use ndarray::array;

let pca = PCA::<f64>::new(1);
let x = array![[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]];
let fitted = pca.fit(&x, &()).unwrap();
let projected = fitted.transform(&x).unwrap();
assert_eq!(projected.ncols(), 1);

Re-exports§

pub use covariance::EllipticEnvelope;
pub use covariance::EmpiricalCovariance;
pub use covariance::FittedCovariance;
pub use covariance::FittedEllipticEnvelope;
pub use covariance::FittedLedoitWolf;
pub use covariance::FittedMinCovDet;
pub use covariance::FittedOAS;
pub use covariance::LedoitWolf;
pub use covariance::MinCovDet;
pub use covariance::ShrunkCovariance;
pub use covariance::OAS;
pub use cross_decomposition::CCA;
pub use cross_decomposition::FittedCCA;
pub use cross_decomposition::FittedPLSCanonical;
pub use cross_decomposition::FittedPLSRegression;
pub use cross_decomposition::FittedPLSSVD;
pub use cross_decomposition::PLSCanonical;
pub use cross_decomposition::PLSRegression;
pub use cross_decomposition::PLSSVD;
pub use dictionary_learning::DictFitAlgorithm;
pub use dictionary_learning::DictTransformAlgorithm;
pub use dictionary_learning::DictionaryLearning;
pub use dictionary_learning::FittedDictionaryLearning;
pub use factor_analysis::FactorAnalysis;
pub use factor_analysis::FittedFactorAnalysis;
pub use fast_ica::Algorithm;
pub use fast_ica::FastICA;
pub use fast_ica::FittedFastICA;
pub use fast_ica::NonLinearity;
pub use incremental_pca::FittedIncrementalPCA;
pub use incremental_pca::IncrementalPCA;
pub use isomap::FittedIsomap;
pub use isomap::Isomap;
pub use kernel_pca::FittedKernelPCA;
pub use kernel_pca::Kernel;
pub use kernel_pca::KernelPCA;
pub use lda_topic::FittedLatentDirichletAllocation;
pub use lda_topic::LatentDirichletAllocation;
pub use lda_topic::LdaLearningMethod;
pub use lle::FittedLLE;
pub use lle::LLE;
pub use mds::Dissimilarity;
pub use mds::FittedMDS;
pub use mds::MDS;
pub use nmf::FittedNMF;
pub use nmf::NMF;
pub use nmf::NMFInit;
pub use nmf::NMFSolver;
pub use pca::FittedPCA;
pub use pca::PCA;
pub use spectral_embedding::Affinity;
pub use spectral_embedding::FittedSpectralEmbedding;
pub use spectral_embedding::SpectralEmbedding;
pub use truncated_svd::FittedTruncatedSVD;
pub use truncated_svd::TruncatedSVD;
pub use tsne::FittedTsne;
pub use tsne::Tsne;
pub use umap::FittedUmap;
pub use umap::Umap;
pub use umap::UmapMetric;

Modules§

covariance
Covariance estimation.
cross_decomposition
Cross-decomposition methods: PLS, CCA, and PLSSVD.
dictionary_learning
Dictionary Learning.
factor_analysis
Factor Analysis (FA) via the EM algorithm.
fast_ica
Fast Independent Component Analysis (FastICA).
incremental_pca
Incremental Principal Component Analysis (IncrementalPCA).
isomap
Isomap (Isometric Mapping).
kernel_pca
Kernel Principal Component Analysis (Kernel PCA).
lda_topic
Latent Dirichlet Allocation (LDA) topic model.
lle
Locally Linear Embedding (LLE).
mds
Multidimensional Scaling (MDS).
nmf
Non-negative Matrix Factorization (NMF).
pca
Principal Component Analysis (PCA).
spectral_embedding
Spectral Embedding (Laplacian Eigenmaps).
truncated_svd
Truncated Singular Value Decomposition.
tsne
t-distributed Stochastic Neighbor Embedding (t-SNE).
umap
Uniform Manifold Approximation and Projection (UMAP).