Crate rusty_machine


§The rusty-machine crate.

A crate built for machine learning that works out-of-the-box.


§Structure

The crate is made up of two primary modules: learning and linalg.

§learning

The learning module contains all of the machine learning functionality: the algorithms, models, and related tools.

The currently supported techniques are listed below, with a short usage sketch after the list:

  • Linear Regression
  • Logistic Regression
  • Generalized Linear Models
  • K-Means Clustering
  • Neural Networks
  • Gaussian Process Regression
  • Support Vector Machines
  • Gaussian Mixture Models
  • Naive Bayes Classifiers
  • DBSCAN
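
As a quick taste, the sketch below runs K-Means Clustering. It assumes the k_means module exposes a KMeansClassifier type with a new(k) constructor implementing UnSupModel; the data is made up and the two-cluster setting is arbitrary.

use rusty_machine::linalg::Matrix;
use rusty_machine::learning::k_means::KMeansClassifier;
use rusty_machine::learning::UnSupModel;

// Four made-up 2-dimensional points forming two loose clusters.
let inputs = Matrix::new(4, 2, vec![1.0, 1.0,
                                    1.2, 0.8,
                                    5.0, 5.0,
                                    5.1, 4.9]);

// A K-Means model looking for 2 clusters.
let mut model = KMeansClassifier::new(2);

// Unsupervised training: only inputs, no targets.
model.train(&inputs).unwrap();

// Assign each input row to one of the learnt clusters.
let classes = model.predict(&inputs).unwrap();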

§linalg

The linalg module reexports some structs and traits from the rulinalg crate. This is to provide easy access to common linear algebra tools within this library.
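
For example, the Matrix and Vector types used throughout this page come from these reexports:

use rusty_machine::linalg::Matrix;
use rusty_machine::linalg::Vector;

// A 2x3 matrix, with the data laid out row by row.
let mat = Matrix::new(2, 3, vec![1.0, 2.0, 3.0,
                                 4.0, 5.0, 6.0]);

// A vector with 3 elements.
let v = Vector::new(vec![1.0, 2.0, 3.0]);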


§Usage

Specific usage of modules is described within the modules themselves. This section will focus on the general workflow for this library.

The models contained within the learning module implement either SupModel or UnSupModel. Both traits provide a train and a predict function, which together form the interface to a model.
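
Roughly, these traits look like the sketch below. This is simplified; the exact generic parameters and the LearningResult alias are defined in the learning module and its error submodule.

// Sketch only: supervised models train on inputs plus targets...
pub trait SupModel<T, U> {
    fn train(&mut self, inputs: &T, targets: &U) -> LearningResult<()>;
    fn predict(&self, inputs: &T) -> LearningResult<U>;
}

// ...while unsupervised models train on inputs alone.
pub trait UnSupModel<T, U> {
    fn train(&mut self, inputs: &T) -> LearningResult<()>;
    fn predict(&self, inputs: &T) -> LearningResult<U>;
}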

You should instantiate the model with your chosen options, train it on your training data, and then predict with your test data. For now, cross-validation, data handling, and many other things are left explicitly to the user.

Here is an example usage for Gaussian Process Regression:

use rusty_machine::linalg::Matrix;
use rusty_machine::linalg::Vector;
use rusty_machine::learning::gp::GaussianProcess;
use rusty_machine::learning::gp::ConstMean;
use rusty_machine::learning::toolkit::kernel;
use rusty_machine::learning::SupModel;

// First we'll get some data.

// Some example training data.
let inputs = Matrix::new(3,3,vec![1.,1.,1.,2.,2.,2.,3.,3.,3.]);
let targets = Vector::new(vec![0.,1.,0.]);

// Some example test data.
let test_inputs = Matrix::new(2,3, vec![1.5,1.5,1.5,2.5,2.5,2.5]);

// Now we'll set up our model.
// This is close to the most complicated a model in rusty-machine gets!

// A squared exponential kernel with lengthscale 2, and amplitude 1.
let ker = kernel::SquaredExp::new(2., 1.);

// The zero function
let zero_mean = ConstMean::default();

// Construct a GP with the specified kernel, mean, and a noise of 0.5.
let mut gp = GaussianProcess::new(ker, zero_mean, 0.5);


// Now we can train and predict from the model.

// Train the model!
gp.train(&inputs, &targets).unwrap();

// Predict output from the test data.
let outputs = gp.predict(&test_inputs).unwrap();

This code could have been a lot simpler if we had just used let mut gp = GaussianProcess::default();. Conversely, you could go further and implement your own kernels and mean functions by using the appropriate traits.
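
For example, a custom kernel is just a type implementing the Kernel trait from learning::toolkit::kernel. The sketch below assumes that trait exposes a single kernel(&self, x1: &[f64], x2: &[f64]) -> f64 method; check the trait documentation for the exact signature.

use rusty_machine::learning::toolkit::kernel::Kernel;

// A simple linear (dot-product) kernel.
struct LinearKernel;

impl Kernel for LinearKernel {
    fn kernel(&self, x1: &[f64], x2: &[f64]) -> f64 {
        x1.iter().zip(x2.iter()).map(|(a, b)| a * b).sum()
    }
}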

Additionally, you’ll notice there are quite a few use statements at the top of this code. We can remove some of these by using the prelude:

use rusty_machine::prelude::*;

let _ = Matrix::new(2,2,vec![2.0;4]);

§Modules

  • analysis: Module for evaluating models.
  • data: Module for data handling.
  • learning: Module for machine learning.
  • linalg: The linear algebra module.
  • prelude: The rusty-machine prelude.