// smartcore/lib.rs
#![allow(
    clippy::type_complexity,
    clippy::too_many_arguments,
    clippy::many_single_char_names,
    clippy::unnecessary_wraps,
    clippy::upper_case_acronyms,
    clippy::approx_constant
)]
#![warn(missing_docs)]

//! # smartcore
//!
//! Welcome to `smartcore`, machine learning in Rust!
//!
//! `smartcore` features various classification, regression and clustering algorithms, including support vector machines, random forests, k-means and DBSCAN,
//! as well as tools for model selection and model evaluation.
//!
//! `smartcore` provides its own traits system that extends the Rust standard library to deal with linear algebra and common
//! computational models. Its API is designed around well-recognized patterns. Extra features (like support for [ndarray](https://docs.rs/ndarray)
//! structures) are available via optional Cargo features.
//!
//! ## Getting Started
//!
//! To start using the latest stable version of `smartcore`, simply add the following to your `Cargo.toml` file:
//! ```ignore
//! [dependencies]
//! smartcore = "*"
//! ```
//!
//! To use the development version of `smartcore` with the latest unstable additions:
//! ```ignore
//! [dependencies]
//! smartcore = { git = "https://github.com/smartcorelib/smartcore", branch = "development" }
//! ```
//!
//! Optional features can be added on top of the base library; for example, to add sample datasets:
//! ```ignore
//! [dependencies]
//! smartcore = { git = "https://github.com/smartcorelib/smartcore", features = ["datasets"] }
//! ```
//! Check `smartcore`'s `Cargo.toml` for the full list of available features.
//!
//! ## Using Jupyter
//!
//! For a quick introduction, Jupyter Notebooks are available [here](https://github.com/smartcorelib/smartcore-jupyter/tree/main/notebooks).
//! You can set up a local environment to run Rust notebooks using [EVCXR](https://github.com/google/evcxr)
//! following [these instructions](https://depth-first.com/articles/2020/09/21/interactive-rust-in-a-repl-and-jupyter-notebook-with-evcxr/).
//!
//! ## First Example
//! For example, you can use this code to fit a [K Nearest Neighbors classifier](neighbors/knn_classifier/index.html) to a dataset that is defined as a standard Rust vector:
//!
//! ```
//! // DenseMatrix definition
//! use smartcore::linalg::basic::matrix::DenseMatrix;
//! // KNNClassifier
//! use smartcore::neighbors::knn_classifier::*;
//! // Various distance metrics
//! use smartcore::metrics::distance::*;
//!
//! // Turn Rust vector-slices with samples into a matrix
//! let x = DenseMatrix::from_2d_array(&[
//!     &[1., 2.],
//!     &[3., 4.],
//!     &[5., 6.],
//!     &[7., 8.],
//!     &[9., 10.]]).unwrap();
//! // Our classes are defined as a vector
//! let y = vec![2, 2, 2, 3, 3];
//!
//! // Train the classifier
//! let knn = KNNClassifier::fit(&x, &y, Default::default()).unwrap();
//!
//! // Predict classes
//! let y_hat = knn.predict(&x).unwrap();
//! ```
//!
//! ## Overview
//!
//! ### Supported algorithms
//! All machine learning algorithms are grouped into these broad categories:
//! * [Clustering](cluster/index.html), unsupervised clustering of unlabeled data
//! * [Matrix Decomposition](decomposition/index.html), various methods for matrix decomposition
//! * [Linear Models](linear/index.html), regression and classification methods where the output is assumed to have a linear relation to the explanatory variables
//! * [Ensemble Models](ensemble/index.html), a variety of regression and classification ensemble models
//! * [Tree-based Models](tree/index.html), classification and regression trees
//! * [Nearest Neighbors](neighbors/index.html), k-nearest neighbors for classification and regression
//! * [Naive Bayes](naive_bayes/index.html), statistical classification based on Bayes' theorem
//! * [SVM](svm/index.html), support vector machines
//!
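//! Models in the other categories follow the same `fit`/`predict` pattern shown in the first example. As a minimal sketch (using default parameters throughout), fitting a [Linear Model](linear/index.html) to perfectly linear data looks like this:
//!
//! ```
//! use smartcore::linalg::basic::matrix::DenseMatrix;
//! use smartcore::linear::linear_regression::LinearRegression;
//!
//! // Each row is a sample, each column a feature
//! let x = DenseMatrix::from_2d_array(&[
//!     &[1., 1.],
//!     &[2., 2.],
//!     &[3., 3.],
//!     &[4., 4.]]).unwrap();
//! let y = vec![2., 4., 6., 8.];
//!
//! // Fit with default parameters, then predict on the training data
//! let lr = LinearRegression::fit(&x, &y, Default::default()).unwrap();
//! let y_hat = lr.predict(&x).unwrap();
//! ```
//!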
//! ### Linear Algebra traits system
//! For an introduction to `smartcore`'s traits system see [this notebook](https://github.com/smartcorelib/smartcore-jupyter/blob/5523993c53c6ec1fd72eea130ef4e7883121c1ea/notebooks/01-A-little-bit-about-numbers.ipynb).

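//! As a brief sketch of the idea (assuming the generic array traits in `linalg::basic::arrays`): methods such as `shape` come from those traits, so code written against them works for any backing matrix type, not just `DenseMatrix`:
//!
//! ```
//! use smartcore::linalg::basic::matrix::DenseMatrix;
//! // Bring the generic array trait into scope to call its methods
//! use smartcore::linalg::basic::arrays::Array;
//!
//! let a = DenseMatrix::from_2d_array(&[&[1., 2.], &[3., 4.]]).unwrap();
//! // `shape` is a trait method, not inherent to DenseMatrix
//! assert_eq!(a.shape(), (2, 2));
//! ```
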
/// Fundamental numbers traits
94pub mod numbers;
95
96/// Various algorithms and helper methods that are used elsewhere in smartcore
97pub mod algorithm;
/// Common interfaces and API shared by smartcore estimators
pub mod api;
99
100/// Algorithms for clustering of unlabeled data
101pub mod cluster;
102/// Various datasets
103#[cfg(feature = "datasets")]
104pub mod dataset;
105/// Matrix decomposition algorithms
106pub mod decomposition;
107/// Ensemble methods, including Random Forest classifier and regressor
108pub mod ensemble;
/// Error types used across smartcore
pub mod error;
110/// Diverse collection of linear algebra abstractions and methods that power smartcore algorithms
111pub mod linalg;
/// Supervised classification and regression models that assume a linear relationship between dependent and explanatory variables.
113pub mod linear;
114/// Functions for assessing prediction error.
115pub mod metrics;
/// Model selection tools, including train/test splitting and cross-validation
117pub mod model_selection;
/// Supervised learning algorithms based on applying Bayes' theorem with the assumption of independence between predictors
119pub mod naive_bayes;
120/// Supervised neighbors-based learning methods
121pub mod neighbors;
122/// Optimization procedures
123pub mod optimization;
124/// Preprocessing utilities
125pub mod preprocessing;
126/// Reading in data from serialized formats
127#[cfg(feature = "serde")]
128pub mod readers;
129/// Support Vector Machines
130pub mod svm;
131/// Supervised tree-based learning methods
132pub mod tree;
/// Gradient boosted trees (XGBoost-style)
pub mod xgboost;
134
135pub(crate) mod rand_custom;