# rkhs

Kernel methods.

Dual-licensed under MIT or Apache-2.0.
## Quickstart

```toml
[dependencies]
rkhs = "0.1.3"
```

```rust
use rkhs::{mmd_unbiased, mmd_permutation_test};

let x = vec![/* samples from P */];
let y = vec![/* samples from Q */];

// MMD: kernel distance between distributions
let mmd = mmd_unbiased(/* … */);
// Permutation test for significance
let result = mmd_permutation_test(/* … */);
```
## Functions

| Function | Purpose |
|---|---|
| `rbf` | Gaussian/RBF kernel |
| `polynomial` | Polynomial kernel |
| `kernel_matrix` | n x n Gram matrix |
| `mmd_biased` | Biased MMD estimate |
| `mmd_unbiased` | Unbiased MMD U-statistic |
| `mmd_permutation_test` | Two-sample test with p-value |
| `median_bandwidth` | Bandwidth selection heuristic |
| `energy_lse` | Log-Sum-Exp energy (Dense AM with RBF) |
| `energy_lsr` | Log-Sum-ReLU energy (Dense AM with Epanechnikov) |
| `retrieve_memory` | Memory retrieval via energy descent |
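To make the kernel and bandwidth entries concrete, here is a minimal, self-contained sketch of what an RBF kernel and the median heuristic compute. This is illustrative only; the crate's actual signatures may differ.

```rust
// RBF (Gaussian) kernel: k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
fn rbf(x: &[f64], y: &[f64], sigma: f64) -> f64 {
    let sq_dist: f64 = x.iter().zip(y).map(|(a, b)| (a - b).powi(2)).sum();
    (-sq_dist / (2.0 * sigma * sigma)).exp()
}

// Median heuristic: set the bandwidth to the median pairwise distance,
// so kernel values are neither saturated at 1 nor collapsed to 0.
fn median_bandwidth(data: &[Vec<f64>]) -> f64 {
    let mut dists = Vec::new();
    for i in 0..data.len() {
        for j in (i + 1)..data.len() {
            let d: f64 = data[i]
                .iter()
                .zip(&data[j])
                .map(|(a, b)| (a - b).powi(2))
                .sum::<f64>()
                .sqrt();
            dists.push(d);
        }
    }
    dists.sort_by(|a, b| a.partial_cmp(b).unwrap());
    dists[dists.len() / 2]
}

fn main() {
    // Identical points have kernel value 1; distant points approach 0.
    assert!((rbf(&[0.0, 0.0], &[0.0, 0.0], 1.0) - 1.0).abs() < 1e-12);
    assert!(rbf(&[0.0], &[10.0], 1.0) < 1e-6);
    let pts = vec![vec![0.0], vec![1.0], vec![3.0]];
    println!("median bandwidth: {}", median_bandwidth(&pts));
}
```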
## Why MMD

MMD (Maximum Mean Discrepancy) measures the distance between two distributions via their kernel mean embeddings. Given samples from P and Q, it tests whether P = Q. Common applications:
- Two-sample testing (detect distribution shift)
- Domain adaptation (minimize source/target divergence)
- GAN evaluation
- Model criticism
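For intuition, the unbiased MMD^2 U-statistic can be sketched in a few lines of plain Rust. This is a standalone illustration with a scalar RBF kernel, not the crate's implementation; names and signatures here are assumptions.

```rust
// Scalar RBF kernel (illustrative).
fn k(a: f64, b: f64, sigma: f64) -> f64 {
    (-(a - b).powi(2) / (2.0 * sigma * sigma)).exp()
}

// Unbiased MMD^2: average within-sample kernel values (excluding the
// diagonal) minus twice the average cross-sample kernel value.
fn mmd2_unbiased(x: &[f64], y: &[f64], sigma: f64) -> f64 {
    let (m, n) = (x.len() as f64, y.len() as f64);
    let mut xx = 0.0;
    for i in 0..x.len() {
        for j in 0..x.len() {
            if i != j { xx += k(x[i], x[j], sigma); }
        }
    }
    let mut yy = 0.0;
    for i in 0..y.len() {
        for j in 0..y.len() {
            if i != j { yy += k(y[i], y[j], sigma); }
        }
    }
    let mut xy = 0.0;
    for &a in x {
        for &b in y { xy += k(a, b, sigma); }
    }
    xx / (m * (m - 1.0)) + yy / (n * (n - 1.0)) - 2.0 * xy / (m * n)
}

fn main() {
    let x = vec![0.0, 0.1, 0.2, 0.3];
    // Similar distributions: MMD^2 near zero (it can dip slightly
    // negative, since the estimator is unbiased). Far-apart
    // distributions: MMD^2 clearly positive.
    let near = mmd2_unbiased(&x, &[0.05, 0.15, 0.25, 0.35], 1.0);
    let far = mmd2_unbiased(&x, &[5.0, 5.1, 5.2, 5.3], 1.0);
    assert!(far > near);
}
```

In practice the statistic alone is not interpretable; the permutation test calibrates it by recomputing MMD on shuffled pooled samples to obtain a p-value.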
## Why "rkhs"
Every positive-definite kernel k(x,y) uniquely defines a Reproducing Kernel Hilbert Space (Moore-Aronszajn theorem). MMD, kernel PCA, SVM, Gaussian processes -- all operate in this space. The name reflects the unifying structure.