NeuroFlow is a fast neural network (deep learning) Rust crate. It relies on three pillars: speed, reliability, and speed again.
## How to use
Let's try to approximate a very simple function: `0.5*sin(e^x) - cos(e^x)` (the expected output below shows this target in the middle column).
```rust
extern crate neuroflow;

use neuroflow::FeedForward;
use neuroflow::data::DataSet;
use neuroflow::activators::Type::Tanh;
```
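The imports alone don't train anything; a minimal sketch of a training program, assuming neuroflow's 0.1 API (`FeedForward::new`, `DataSet::push`, the chainable `activation`/`learning_rate`/`train` setters, and `calc`). The layer sizes and hyperparameters here are illustrative, not tuned:

```rust
fn main() {
    // 1 input neuron, several hidden layers, 1 output neuron
    // (the hidden-layer sizes are illustrative)
    let mut nn = FeedForward::new(&[1, 7, 8, 8, 7, 1]);

    // DataSet is a container for training samples; push takes an
    // input slice and the desired output slice
    let mut data = DataSet::new();
    let mut i = -3.0;
    while i <= 2.5 {
        data.push(&[i], &[0.5 * i.exp().sin() - i.exp().cos()]);
        i += 0.05;
    }

    // Pick the activation function, set the learning rate, and train
    nn.activation(Tanh)
        .learning_rate(0.01)
        .train(&data, 50_000);

    // Compare the network's predictions with the target values
    let mut i = 0.0;
    while i <= 0.3 {
        let res = nn.calc(&[i])[0];
        let target = 0.5 * i.exp().sin() - i.exp().cos();
        println!("for [{:.3}], [{:.3}] -> [{:.3}]", i, target, res);
        i += 0.07;
    }
}
```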
Expected output:

```
for [0.000], [-0.120] -> [-0.119]
for [0.070], [-0.039] -> [-0.037]
for [0.140], [0.048] -> [0.050]
for [0.210], [0.141] -> [0.141]
for [0.280], [0.240] -> [0.236]
```
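The middle value in each row is the target function evaluated at the input. As a sanity check, a few lines of plain Rust (no neural network involved; the `target` helper is named for illustration) reproduce that column to three decimals:

```rust
// Target function from the example: 0.5*sin(e^x) - cos(e^x)
fn target(x: f64) -> f64 {
    0.5 * x.exp().sin() - x.exp().cos()
}

fn main() {
    let mut x = 0.0;
    while x <= 0.3 {
        // Matches the middle column of the expected output
        println!("for [{:.3}], target [{:.3}]", x, target(x));
        x += 0.07;
    }
}
```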
But we don't want to lose our trained network so easily. So, there is functionality to save neural networks to files and restore them later.
```rust
/*
    In order to save a neural network to a file, call the save function
    from the neuroflow::io module.
    The first argument is a reference to the neural network being saved;
    the second argument is the path to the file.
*/
neuroflow::io::save(&mut nn, "test.flow").unwrap();

/*
    After we have saved the neural network to the file, we can restore it
    by calling the load function from the neuroflow::io module.
    We must specify the type of the new_nn variable.
    The only argument of the load function is the path to the file
    containing the neural network.
*/
let mut new_nn: FeedForward = neuroflow::io::load("test.flow").unwrap();
```
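The restored network can then be used exactly like the original one. A quick check, continuing the snippet above (`nn` and `new_nn` are the networks from it, and `calc` is assumed to work on the restored network unchanged):

```rust
// The restored network should reproduce the original network's outputs
let original = nn.calc(&[0.14])[0];
let restored = new_nn.calc(&[0.14])[0];
println!("original: [{:.3}], restored: [{:.3}]", original, restored);
```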
## Classic XOR problem (with no classic input of data)
Let's create a file named `TerribleTom.csv` in the root of the project. This file should have the following innards:

```
0,0,-,0
0,1,-,1
1,0,-,1
1,1,-,0
```

where `-` is the delimiter that separates the input vector from its desired output vector.
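The format is easy to parse by hand; a short plain-Rust sketch (independent of neuroflow's own loader, with a hypothetical `parse_line` helper) that splits one such line at the `-` delimiter:

```rust
// Split one line of the form "0,1,-,1" into (inputs, outputs),
// where "-" separates the input vector from the desired output vector
fn parse_line(line: &str) -> (Vec<f64>, Vec<f64>) {
    let fields: Vec<&str> = line.split(',').collect();
    let sep = fields.iter().position(|&f| f == "-").expect("no '-' delimiter");
    let parse = |s: &[&str]| -> Vec<f64> {
        s.iter().map(|v| v.parse().unwrap()).collect()
    };
    (parse(&fields[..sep]), parse(&fields[sep + 1..]))
}

fn main() {
    let (x, y) = parse_line("0,1,-,1");
    println!("{:?} -> {:?}", x, y);
}
```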
```rust
extern crate neuroflow;

use neuroflow::FeedForward;
use neuroflow::data::DataSet;
use neuroflow::activators::Type::Tanh;
```
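With the file in place, a training sketch under the same 0.1 API assumptions. The network shape and hyperparameters are illustrative, and `DataSet::from_csv` is assumed to load the file directly (check the crate docs for its exact signature):

```rust
fn main() {
    // 2 input neurons, one hidden layer with 2 neurons, 1 output neuron
    let mut nn = FeedForward::new(&[2, 2, 1]);

    // Load the training data from the file created above
    let data = DataSet::from_csv("TerribleTom.csv");

    // Set parameters and train
    nn.activation(Tanh)
        .learning_rate(0.1)
        .momentum(0.15)
        .train(&data, 20_000);

    // Check all four XOR cases
    for v in &[[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]] {
        println!("for {:?} -> [{:.3}]", v, nn.calc(v)[0]);
    }
}
```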
Expected output:

```
for [0.000, 0.000], [0.000] -> [0.000]
for [1.000, 0.000], [1.000] -> [1.000]
for [0.000, 1.000], [1.000] -> [1.000]
for [1.000, 1.000], [0.000] -> [0.000]
```
## Installation
Insert the next lines into your project's `Cargo.toml`:

```toml
[dependencies]
neuroflow = "0.1.3"
```
Then, in your project's root file:

```rust
extern crate neuroflow;
```
## License
MIT License
## Attribution
The origami bird in the logo is made by Freepik.