Crate smartcore


Welcome to SmartCore, the most advanced machine learning library in Rust!

SmartCore features various classification, regression and clustering algorithms including support vector machines, random forests, k-means and DBSCAN, as well as tools for model selection and model evaluation.

SmartCore is well integrated with a wide variety of libraries that provide support for large, multi-dimensional arrays and matrices. At this moment, all of SmartCore's algorithms work with ordinary Rust vectors, as well as with matrices and vectors defined in the ndarray and nalgebra packages.

Getting Started

To start using SmartCore, simply add the following to your Cargo.toml file:

[dependencies]
smartcore = "0.2.0"

All machine learning algorithms in SmartCore are grouped into these broad categories:

  • Clustering, unsupervised clustering of unlabeled data.
  • Matrix Decomposition, various methods for matrix decomposition.
  • Linear Models, regression and classification methods where the output is assumed to have a linear relation to the explanatory variables.
  • Ensemble Models, a variety of regression and classification ensemble models.
  • Tree-based Models, classification and regression trees.
  • Nearest Neighbors, k-nearest neighbors for classification and regression.
  • Naive Bayes, statistical classification based on Bayes' theorem.
  • SVM, support vector machines.

For example, you can use this code to fit a K Nearest Neighbors classifier to a dataset defined as a standard Rust vector:

// DenseMatrix definition
use smartcore::linalg::naive::dense_matrix::*;
// KNNClassifier
use smartcore::neighbors::knn_classifier::*;
// Various distance metrics
use smartcore::math::distance::*;

// Turn Rust vectors with samples into a matrix
let x = DenseMatrix::from_2d_array(&[
   &[1., 2.],
   &[3., 4.],
   &[5., 6.],
   &[7., 8.],
   &[9., 10.]]);
// Our classes are defined as a Vector
let y = vec![2., 2., 2., 3., 3.];

// Train classifier
let knn = KNNClassifier::fit(&x, &y, Default::default()).unwrap();

// Predict classes
let y_hat = knn.predict(&x).unwrap();
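To illustrate what the classifier above is doing, here is a minimal std-only sketch of the k-nearest-neighbors idea (Euclidean distance plus majority vote). This is an illustration of the technique only, not SmartCore's implementation:

```rust
use std::collections::HashMap;

// Predict the label of `query` by majority vote among the k closest
// training samples (squared Euclidean distance). Illustration only.
fn knn_predict(x_train: &[Vec<f64>], y_train: &[u32], query: &[f64], k: usize) -> u32 {
    // Squared distance from the query point to every training sample
    let mut dists: Vec<(f64, u32)> = x_train
        .iter()
        .zip(y_train)
        .map(|(row, &label)| {
            let d2: f64 = row.iter().zip(query).map(|(a, b)| (a - b).powi(2)).sum();
            (d2, label)
        })
        .collect();
    // Sort by distance and keep the k closest
    dists.sort_by(|a, b| a.0.partial_cmp(&b.0).unwrap());
    // Majority vote among the k nearest labels
    let mut counts: HashMap<u32, usize> = HashMap::new();
    for &(_, label) in dists.iter().take(k) {
        *counts.entry(label).or_insert(0) += 1;
    }
    counts.into_iter().max_by_key(|&(_, c)| c).map(|(l, _)| l).unwrap()
}

fn main() {
    // Same toy dataset as above
    let x = vec![
        vec![1., 2.], vec![3., 4.], vec![5., 6.], vec![7., 8.], vec![9., 10.],
    ];
    let y = vec![2, 2, 2, 3, 3];
    // The point [8., 9.] sits among the class-3 samples
    println!("{}", knn_predict(&x, &y, &[8., 9.], 3)); // prints 3
}
```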



Modules

algorithm: Various algorithms and helper methods that are used elsewhere in SmartCore
api: Common interfaces and API
cluster: Algorithms for clustering of unlabeled data
dataset: Various datasets
decomposition: Matrix decomposition algorithms
ensemble: Ensemble methods, including Random Forest classifier and regressor
error: Custom warnings and errors
linalg: Diverse collection of linear algebra abstractions and methods that power SmartCore algorithms
linear: Supervised classification and regression models that assume a linear relationship between dependent and explanatory variables
math: Helper methods and classes, including definitions of distance metrics
metrics: Functions for assessing prediction error
model_selection: Model selection methods
naive_bayes: Supervised learning algorithms based on applying Bayes' theorem with independence assumptions between predictors
neighbors: Supervised neighbors-based learning methods
svm: Support Vector Machines
tree: Supervised tree-based learning methods
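As a plain illustration of the kind of scoring the prediction-error functions provide, classification accuracy (the fraction of predictions matching the true labels) can be sketched in a few lines of std-only Rust. This is a sketch of the metric itself, not SmartCore's API:

```rust
// Fraction of predictions that exactly match the true labels. Illustration only.
fn accuracy(y_true: &[f64], y_pred: &[f64]) -> f64 {
    assert_eq!(y_true.len(), y_pred.len(), "label vectors must have equal length");
    let correct = y_true.iter().zip(y_pred).filter(|(a, b)| a == b).count();
    correct as f64 / y_true.len() as f64
}

fn main() {
    // Same toy labels as the example above, with one misclassified sample
    let y = vec![2., 2., 2., 3., 3.];
    let y_hat = vec![2., 2., 3., 3., 3.];
    println!("{}", accuracy(&y, &y_hat)); // 4 of 5 correct: prints 0.8
}
```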