Crate neuroflow


NeuroFlow is a neural networks (and deep learning, of course) Rust crate. It relies on three pillars: speed, reliability, and speed again.

Let’s jump right into some examples.

Examples

Here we are going to approximate the very simple function 0.5*sin(e^x) - cos(e^(-x)).


use neuroflow::FeedForward;
use neuroflow::data::DataSet;
use neuroflow::activators::Type::Tanh;


 /*
     Define a neural network with 1 neuron in the input layer and 4 hidden layers.
     And, since our function returns a single value, it is reasonable to have 1 neuron
     in the output layer.
 */
 let mut nn = FeedForward::new(&[1, 7, 8, 8, 7, 1]);

 /*
     Define the DataSet.

     DataSet is a type that significantly simplifies work with neural networks.
     The majority of its functionality is still under development :(
 */
 let mut data: DataSet = DataSet::new();
 let mut i = -3.0;

 // Push the data to DataSet (method push accepts two slices: input data and expected output)
 while i <= 2.5 {
     data.push(&[i], &[0.5 * i.exp().sin() - (-i).exp().cos()]);
     i += 0.05;
 }

 // Here, we set necessary parameters and train neural network
 // by our DataSet with 50 000 iterations
 nn.activation(Tanh)
     .learning_rate(0.01)
     .train(&data, 50_000);

 let mut res;

 // Let's check the result
 i = 0.0;
 while i <= 0.3 {
     res = nn.calc(&[i])[0];
     println!("for [{:.3}], [{:.3}] -> [{:.3}]", i, 0.5 * i.exp().sin() - (-i).exp().cos(), res);
     i += 0.07;
 }

You don’t want to lose your so-hard-trained network, my friend! For that, there are functions for saving and loading neural networks to and from files. They are located in the neuroflow::io module.

use neuroflow::io;
/*
    In order to save a neural network to a file, call the save function from the
    neuroflow::io module.

    The first argument is a mutable reference to the neural network being saved;
    the second argument is the path to the file.
*/
io::save(&mut nn, "test.flow").unwrap();

/*
    After we have saved the neural network to a file, we can restore it by calling
    the load function from the neuroflow::io module.

    We must specify the type of the new_nn variable.
    The only argument of the load function is the path to the file containing
    the neural network.
*/
let mut new_nn: FeedForward = io::load("test.flow").unwrap();

We said only a few words about the DataSet structure. It deserves a closer look.

Simply put, DataSet is just a container for your input vectors and their desired outputs, but with additional functionality.

use std::path::Path;
use neuroflow::data::DataSet;

// You can create an empty DataSet by calling its constructor new
let mut d1 = DataSet::new();

// To push new data to a DataSet instance, call the push method
d1.push(&[0.1, 0.2], &[1.0, 2.3]);
d1.push(&[0.05, 0.01], &[0.5, 1.1]);

// You can load data from a CSV file
let p = "file.csv";
if Path::new(p).exists() {
    let mut d2 = DataSet::from_csv(p); // Easy, eh?
}

// You can round all DataSet elements to a given precision
d1.round(2); // 2 is the number of digits after the decimal point

// Also, it is possible to get some statistical information.
// In the current version you can only get the mean values (for each dimension, in
// other words each column) of the input vectors and desired output vectors
let (x, y) = d1.mean();

Modules

  • activators — Contains popular neural network activation functions and their derivatives
  • data — Contains functions, structs, and traits for data storage, access, and processing
  • io — Contains functions, structs, enums, and traits for neural network input/output, e.g. saving a network to a file and then loading it back

Structs

  • FeedForward — A feed forward (multilayer perceptron) neural network trained by the backpropagation algorithm. You can use it for both approximation and classification tasks

Enums

  • Custom ErrorKind enum for handling multiple error types

Traits

  • This trait should be implemented by a neural network structure when you want it to be transformable to other formats. Note that you also need to implement the serde::Serialize and serde::Deserialize traits first. Fortunately, you can do this easily with the derive attribute.