# sklears-svm

High-performance Support Vector Machine implementations for Rust with advanced kernels and optimization algorithms, delivering a 5-15x speedup over scikit-learn.
Latest release: 0.1.0-beta.1 (January 1, 2026). See the workspace release notes for highlights and upgrade guidance.
## Overview
sklears-svm provides comprehensive SVM implementations:
- Core Algorithms: SVC, SVR, LinearSVC, NuSVC, NuSVR
- Kernel Functions: Linear, RBF, Polynomial, Sigmoid, Custom kernels
- Optimization: SMO, coordinate descent, stochastic gradient descent
- Advanced Features: Multi-class strategies, probability calibration, online learning
- Performance: SIMD optimization, sparse data support, optional CUDA/WebGPU acceleration
## Quick Start

```rust
use sklears_svm::{SVC, SVR, LinearSVC, Kernel, Penalty, Loss};
use ndarray::array;

// Classification with RBF kernel
let svc = SVC::builder()
    .kernel(Kernel::Rbf)
    .C(1.0)
    .probability(true)
    .build();

// Regression with polynomial kernel
let svr = SVR::builder()
    .kernel(Kernel::Polynomial)
    .epsilon(0.1)
    .build();

// Linear SVM for large-scale problems
let linear_svc = LinearSVC::builder()
    .penalty(Penalty::L2)
    .loss(Loss::SquaredHinge)
    .dual(false) // Primal optimization for n_samples >> n_features
    .build();

// Train and predict
let X = array![[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]];
let y = array![0, 0, 1, 1];
let fitted = svc.fit(&X, &y)?;
let predictions = fitted.predict(&X)?;
let probabilities = fitted.predict_proba(&X)?;
```
## Advanced Features

### Custom Kernels

```rust
use sklears_svm::{SVC, CustomKernel};

// Define a custom kernel function from a closure
let custom_kernel = CustomKernel::new(|x: &[f64], y: &[f64]| {
    x.iter().zip(y).map(|(a, b)| a * b).sum::<f64>().tanh()
});

let svc = SVC::builder()
    .kernel(custom_kernel)
    .build();
```
### Multi-class Strategies

```rust
use sklears_svm::{SVC, LinearSVC, MultiClass};

// One-vs-Rest strategy
let svc_ovr = SVC::builder()
    .multi_class(MultiClass::OneVsRest)
    .build();

// One-vs-One strategy (default for SVC)
let svc_ovo = SVC::builder()
    .multi_class(MultiClass::OneVsOne)
    .build();

// Crammer-Singer multi-class
let svc_cs = LinearSVC::builder()
    .multi_class(MultiClass::CrammerSinger)
    .build();
```
### Online Learning

```rust
use sklears_svm::{SGDClassifier, Loss, LearningRate};

let mut sgd = SGDClassifier::builder()
    .loss(Loss::Hinge)
    .learning_rate(LearningRate::Optimal)
    .build();

// Incremental learning over mini-batches
for (x_batch, y_batch) in data_stream {
    sgd.partial_fit(&x_batch, &y_batch)?;
}
```
### Probability Calibration

```rust
use sklears_svm::{SVC, CalibrationMethod};

let svc = SVC::builder()
    .probability(true)
    .calibration_method(CalibrationMethod::Platt)
    .build();

let fitted = svc.fit(&X, &y)?;
let calibrated_probs = fitted.predict_proba(&X)?;
```
## Performance Features

### Sparse Data Support

```rust
use sklears_svm::{SparseSVC, Kernel};
use sprs::CsMat;

let sparse_X = CsMat::csr_from_dense(X.view(), 0.0);

let sparse_svc = SparseSVC::builder()
    .kernel(Kernel::Linear)
    .build();

let fitted = sparse_svc.fit(&sparse_X, &y)?;
```
### Parallel Training

```rust
let svc = SVC::builder()
    .kernel(Kernel::Rbf)
    .n_jobs(4)       // Use 4 threads
    .cache_size(200) // MB for kernel cache
    .build();
```
### Optimization Strategies

```rust
use sklears_svm::{SVC, LinearSVC, SGDClassifier, Solver};

// SMO with shrinking heuristics
let svc_smo = SVC::builder()
    .solver(Solver::Smo)
    .shrinking(true)
    .build();

// Coordinate descent for linear SVM
let linear_svc_cd = LinearSVC::builder()
    .solver(Solver::CoordinateDescent)
    .build();

// Stochastic gradient descent
let sgd_svm = SGDClassifier::builder()
    .alpha(1e-4)
    .max_iter(1000)
    .build();
```
## Advanced Algorithms

### Nu-Support Vector Machines

```rust
use sklears_svm::{NuSVC, NuSVR, Kernel};

// Nu-SVC with automatic margin
let nu_svc = NuSVC::builder()
    .nu(0.5) // Upper bound on fraction of margin errors
    .kernel(Kernel::Rbf)
    .build();

// Nu-SVR for regression
let nu_svr = NuSVR::builder()
    .nu(0.5)
    .kernel(Kernel::Rbf)
    .build();
```
### One-Class SVM

```rust
use sklears_svm::{OneClassSVM, Kernel};

// Anomaly detection
let oc_svm = OneClassSVM::builder()
    .nu(0.05) // Expected fraction of outliers
    .kernel(Kernel::Rbf)
    .build();

let fitted = oc_svm.fit(&X)?;
let anomaly_scores = fitted.decision_function(&X)?;
```
### Kernel Approximation

```rust
use sklears_svm::{Nystroem, RbfSampler, LinearSVC, Kernel};

// Nystroem approximation for large-scale kernels
let nystroem = Nystroem::builder()
    .kernel(Kernel::Rbf)
    .n_components(100)
    .build();

// Random Fourier features
let rbf_sampler = RbfSampler::builder()
    .gamma(0.5)
    .n_components(100)
    .build();

// Use with a linear SVM for speed
let X_transformed = nystroem.fit_transform(&X)?;
let linear_svc = LinearSVC::default();
let fitted = linear_svc.fit(&X_transformed, &y)?;
```
## Benchmarks
Performance comparisons:
| Algorithm | scikit-learn | sklears-svm | Speedup |
|---|---|---|---|
| Linear SVC | 45ms | 5ms | 9x |
| RBF SVC | 120ms | 15ms | 8x |
| Nu-SVC | 135ms | 18ms | 7.5x |
| SGD Classifier | 8ms | 0.8ms | 10x |
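To get comparable numbers on your own workload, a minimal timing harness along the following lines can be used. This is a sketch, not the benchmark configuration behind the table above: the synthetic dataset, sample count, and parameter values are illustrative assumptions, and the builder calls mirror the Quick Start example.

```rust
use std::time::Instant;
use ndarray::{Array1, Array2};
use sklears_svm::{SVC, Kernel};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical synthetic data: 1,000 samples, 20 features, 2 classes.
    let n = 1_000;
    let x = Array2::from_shape_fn((n, 20), |(i, j)| ((i * 31 + j * 17) % 97) as f64 / 97.0);
    let y = Array1::from_shape_fn(n, |i| (i % 2) as i32);

    let svc = SVC::builder()
        .kernel(Kernel::Rbf)
        .C(1.0)
        .build();

    // Wall-clock time for a single fit; average several runs for stable comparisons.
    let start = Instant::now();
    let _fitted = svc.fit(&x, &y)?;
    println!("fit took {:?}", start.elapsed());
    Ok(())
}
```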
Architecture
sklears-svm/
├── core/ # Core SVM algorithms
├── kernels/ # Kernel implementations
├── solvers/ # Optimization algorithms
├── multiclass/ # Multi-class strategies
├── online/ # Incremental learning
├── sparse/ # Sparse data support
└── gpu/ # GPU acceleration (WIP)
## Status
- Core Algorithms: 90% complete
- Kernel Functions: All major kernels implemented
- Optimization: SMO, CD, SGD implemented
- GPU Support: In development
## Contributing
Priority areas for contribution:
- GPU kernel computations
- Additional kernel functions
- Performance optimizations
- Cross-validation utilities
See CONTRIBUTING.md for guidelines.
## License
Licensed under either of:
- Apache License, Version 2.0
- MIT license
## Citation