Crate tsuga

An early stage machine-learning library in Rust

tsuga is an early stage machine learning library in Rust for building neural networks. It uses ndarray as the linear algebra backend and operates primarily on two-dimensional f32 arrays (Array2<f32> types). At the moment, its primary function has been as a testbed for various API ideas and as an educational exercise, so it probably isn't yet suitable for serious use. Most of the project's focus so far has been on the image-processing domain, although the tools and layout should be generally applicable to higher- and lower-dimensional datasets as well.

Tsuga currently uses the Builder pattern for constructing fully-connected networks. Since networks are complex compound structures, this pattern helps to make the layout of the network explicit and modular.


Tsuga uses the minifb crate to display sample images during development, which means you may need to install certain system dependencies, e.g. on Debian/Ubuntu:

$ sudo apt install libxkbcommon-dev libwayland-cursor0 libwayland-dev

MNIST Example

The following is a reduced-code example of building a network to train on and evaluate the MNIST (or Fashion MNIST) dataset. Including the time spent unpacking the MNIST binary files, this network achieves:

  • An accuracy of ~91.5% over 1000 iterations in 3.65 seconds
  • An accuracy of ~97.1% over 10,000 iterations in 29.43 seconds
use ndarray::prelude::*;
use tsuga::prelude::*;

fn main() {
   // Reduced-version for importing the MNIST data and unpacking it into four Array2<f32> data structures
   let (input, output, test_input, test_output) = mnist_as_ndarray();
   println!("Successfully unpacked the MNIST dataset into Array2<f32> format!");

   // Now we can begin configuring any additional hidden layers, specifying their size and activation function
   // We could also use activation functions like "relu"
   let mut layers_cfg: Vec<FCLayer> = Vec::new();
   let sigmoid_layer_0 = FCLayer::new("sigmoid", 128);
   layers_cfg.push(sigmoid_layer_0);
   let sigmoid_layer_1 = FCLayer::new("sigmoid", 64);
   layers_cfg.push(sigmoid_layer_1);

   // The network can now be built using the specified layer configurations
   // Several other options for tuning the network's performance are available as well
   let mut fcn = FullyConnectedNetwork::default(input, output)
       .add_layers(layers_cfg)
       .iterations(1000)
       .build();

   // Training occurs in place on the network
   fcn.train().expect("An error occurred while training");

   // We can now pass an appropriately-sized input through our trained network,
   // receiving an Array2<f32> on the output
   let test_result = fcn.evaluate(test_input.clone());

   // And will compare that output against the ideal one-hot encoded testing label array
   // And will compare that output against the ideal one-hot encoded testing label array
   compare_results(test_result.clone(), test_output);
}

Modules

  • Activation functions which can be applied element-wise or to subsets of the network's matrices
  • Definitions for convolutional layers
  • An unstable and immature module for chaining static-kernel sliding-window convolutions of input data
  • Definitions for fully-connected layers which compose the neural networks
  • Constructs, trains, and evaluates a neural network based on supplied input and output data
  • Contains all the necessary imports for building and training a basic neural network
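To show what an element-wise activation looks like in practice, here is a std-only sigmoid sketch. Tsuga's activation functions operate on ndarray matrices; the functions below are hypothetical stand-ins working on slices:

```rust
// Std-only sketch of an element-wise sigmoid activation; tsuga's
// activation functions apply to ndarray types rather than slices.

// The logistic sigmoid: maps any real input into the open interval (0, 1)
fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp())
}

// Apply the activation to every element, as a layer would to its matrix
fn apply_sigmoid(values: &[f32]) -> Vec<f32> {
    values.iter().map(|&x| sigmoid(x)).collect()
}

fn main() {
    let row = vec![-2.0, 0.0, 2.0];
    let activated = apply_sigmoid(&row);
    // sigmoid(0) is exactly 0.5, and every output stays inside (0, 1)
    assert!((activated[1] - 0.5).abs() < 1e-6);
    assert!(activated.iter().all(|&v| v > 0.0 && v < 1.0));
    println!("{:?}", activated);
}
```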