smartcore_proba/lib.rs

#![allow(
    clippy::type_complexity,
    clippy::too_many_arguments,
    clippy::many_single_char_names,
    clippy::unnecessary_wraps,
    clippy::upper_case_acronyms,
    clippy::approx_constant
)]
#![warn(missing_docs)]
#![warn(rustdoc::missing_doc_code_examples)]

//! # smartcore
//!
//! Welcome to `smartcore`, machine learning in Rust!
//!
//! `smartcore` features various classification, regression and clustering algorithms, including support vector machines, random forests, k-means and DBSCAN,
//! as well as tools for model selection and model evaluation.
//!
//! `smartcore` provides its own traits system that extends the Rust standard library to deal with linear algebra and common
//! computational models. Its API is designed around well-recognized patterns. Extra functionality (such as support for [ndarray](https://docs.rs/ndarray)
//! structures) is available via optional features.
//!
//! ## Getting Started
//!
//! To start using the latest stable version of `smartcore`, simply add the following to your `Cargo.toml` file:
//! ```ignore
//! [dependencies]
//! smartcore = "*"
//! ```
//!
//! To use the development version of `smartcore` with the latest unstable additions:
//! ```ignore
//! [dependencies]
//! smartcore = { git = "https://github.com/smartcorelib/smartcore", branch = "development" }
//! ```
//!
//! Optional features can be added on top of the base library, for example to include the sample datasets:
//! ```ignore
//! [dependencies]
//! smartcore = { git = "https://github.com/smartcorelib/smartcore", features = ["datasets"] }
//! ```
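//! Multiple features can be combined in a single dependency declaration; for example, to enable both the sample
//! datasets and the `serde`-gated readers (a sketch using the feature names referenced by the optional modules below):
//! ```ignore
//! [dependencies]
//! smartcore = { version = "*", features = ["datasets", "serde"] }
//! ```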
//! Check `smartcore`'s `Cargo.toml` for available features.
//!
//! ## Using Jupyter
//! For a quick introduction, Jupyter notebooks are available [here](https://github.com/smartcorelib/smartcore-jupyter/tree/main/notebooks).
//! You can set up a local environment to run Rust notebooks using [EVCXR](https://github.com/google/evcxr)
//! following [these instructions](https://depth-first.com/articles/2020/09/21/interactive-rust-in-a-repl-and-jupyter-notebook-with-evcxr/).
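//!
//! Once the EVCXR kernel is running, a notebook cell can pull `smartcore` in directly with the `:dep` command
//! (a sketch; point it at whichever release or branch you want to try):
//! ```ignore
//! :dep smartcore = { git = "https://github.com/smartcorelib/smartcore", branch = "development" }
//! ```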
//!
//! ## First Example
//! You can use the following code to fit a [K Nearest Neighbors classifier](neighbors/knn_classifier/index.html) to a dataset defined with standard Rust vectors:
//!
//! ```
//! // DenseMatrix definition
//! use smartcore::linalg::basic::matrix::DenseMatrix;
//! // KNNClassifier
//! use smartcore::neighbors::knn_classifier::*;
//! // Various distance metrics
//! use smartcore::metrics::distance::*;
//!
//! // Turn Rust vector-slices with samples into a matrix
//! let x = DenseMatrix::from_2d_array(&[
//!    &[1., 2.],
//!    &[3., 4.],
//!    &[5., 6.],
//!    &[7., 8.],
//!    &[9., 10.]]).unwrap();
//! // Our classes are defined as a vector
//! let y = vec![2, 2, 2, 3, 3];
//!
//! // Train classifier
//! let knn = KNNClassifier::fit(&x, &y, Default::default()).unwrap();
//!
//! // Predict classes
//! let y_hat = knn.predict(&x).unwrap();
//! ```
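//!
//! To get a rough idea of how well the classifier fits, you can compare its predictions against the original
//! labels with plain iterator code (the [metrics](metrics/index.html) module provides proper scoring functions).
//! A minimal, self-contained sketch reusing the same toy data:
//!
//! ```
//! use smartcore::linalg::basic::matrix::DenseMatrix;
//! use smartcore::neighbors::knn_classifier::*;
//!
//! let x = DenseMatrix::from_2d_array(&[
//!    &[1., 2.],
//!    &[3., 4.],
//!    &[5., 6.],
//!    &[7., 8.],
//!    &[9., 10.]]).unwrap();
//! let y = vec![2, 2, 2, 3, 3];
//!
//! let knn = KNNClassifier::fit(&x, &y, Default::default()).unwrap();
//! let y_hat = knn.predict(&x).unwrap();
//!
//! // Count how many predictions agree with the original labels
//! let n_correct = y.iter().zip(y_hat.iter()).filter(|(t, p)| t == p).count();
//! assert!(n_correct <= y.len());
//! ```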
//!
//! ## Overview
//!
//! ### Supported algorithms
//! All machine learning algorithms are grouped into these broad categories:
//! * [Clustering](cluster/index.html), unsupervised clustering of unlabeled data.
//! * [Matrix Decomposition](decomposition/index.html), various methods for matrix decomposition.
//! * [Linear Models](linear/index.html), regression and classification methods where the output is assumed to have a linear relationship with the explanatory variables (see the sketch after this list).
//! * [Ensemble Models](ensemble/index.html), a variety of regression and classification ensemble models.
//! * [Tree-based Models](tree/index.html), classification and regression trees.
//! * [Nearest Neighbors](neighbors/index.html), K Nearest Neighbors for classification and regression.
//! * [Naive Bayes](naive_bayes/index.html), statistical classification technique based on Bayes' theorem.
//! * [SVM](svm/index.html), support vector machines.
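//!
//! As a second, illustrative sketch, here is an ordinary least squares fit with
//! [LinearRegression](linear/linear_regression/index.html) on a tiny made-up dataset (the data and the use of
//! default parameters are for illustration only):
//!
//! ```
//! use smartcore::linalg::basic::matrix::DenseMatrix;
//! use smartcore::linear::linear_regression::LinearRegression;
//!
//! // A tiny, made-up regression dataset with two features per sample
//! let x = DenseMatrix::from_2d_array(&[
//!    &[1., 1.],
//!    &[2., 1.],
//!    &[3., 2.],
//!    &[4., 3.]]).unwrap();
//! let y = vec![2., 3., 5., 7.];
//!
//! // Fit the model with default parameters and predict on the training data
//! let lr = LinearRegression::fit(&x, &y, Default::default()).unwrap();
//! let y_hat = lr.predict(&x).unwrap();
//! assert_eq!(y_hat.len(), y.len());
//! ```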
//!
//! ### Linear Algebra traits system
//! For an introduction to `smartcore`'s traits system see [this notebook](https://github.com/smartcorelib/smartcore-jupyter/blob/5523993c53c6ec1fd72eea130ef4e7883121c1ea/notebooks/01-A-little-bit-about-numbers.ipynb)
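//!
//! As a tiny illustration, matrix types such as `DenseMatrix` expose their behaviour through shared traits
//! defined in [linalg](linalg/index.html) (a sketch assuming the generic `Array` trait in `linalg::basic::arrays`):
//!
//! ```
//! use smartcore::linalg::basic::matrix::DenseMatrix;
//! // Bring the generic `Array` trait into scope to use its methods on `DenseMatrix`
//! use smartcore::linalg::basic::arrays::Array;
//!
//! let a = DenseMatrix::from_2d_array(&[&[1., 2., 3.], &[4., 5., 6.]]).unwrap();
//! // `shape` comes from the `Array` trait and returns (rows, columns)
//! assert_eq!(a.shape(), (2, 3));
//! ```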

/// Fundamental numeric traits
pub mod numbers;

/// Various algorithms and helper methods that are used elsewhere in smartcore
pub mod algorithm;
/// Common interfaces (traits) implemented by estimators, predictors and transformers
pub mod api;

/// Algorithms for clustering of unlabeled data
pub mod cluster;
/// Various datasets
#[cfg(feature = "datasets")]
pub mod dataset;
/// Matrix decomposition algorithms
pub mod decomposition;
/// Ensemble methods, including Random Forest classifier and regressor
pub mod ensemble;
/// Error types used across `smartcore`
pub mod error;
/// Diverse collection of linear algebra abstractions and methods that power smartcore algorithms
pub mod linalg;
/// Supervised classification and regression models that assume a linear relationship between dependent and explanatory variables
pub mod linear;
/// Functions for assessing prediction error
pub mod metrics;
/// Model selection tools: train/test splitting and cross-validation
pub mod model_selection;
/// Supervised learning algorithms based on applying Bayes' theorem with the assumption of independence between predictors
pub mod naive_bayes;
/// Supervised neighbors-based learning methods
pub mod neighbors;
/// Optimization procedures
pub mod optimization;
/// Preprocessing utilities
pub mod preprocessing;
/// Reading in data from serialized formats
#[cfg(feature = "serde")]
pub mod readers;
/// Support Vector Machines
pub mod svm;
/// Supervised tree-based learning methods
pub mod tree;

pub(crate) mod rand_custom;