§mynn
A hobbyist no-std neural network library.
§Explanation
This is a small library (currently ~200 lines, excluding doc comments and helper macros) that I initially created during my lunch break while attempting to represent the shape of a neural network in Rust’s type system. As a result, all the vectors became fixed-size arrays, which allows the neural network to be no-std and, in theory, usable on microcontrollers and embedded platforms.
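The shape-in-the-type-system idea can be sketched with const generics. This is only an illustrative sketch of the approach, not mynn’s actual code; the `Dense` type and its methods are invented here:

```rust
// Illustrative sketch only: a layer whose input and output sizes are
// const generic parameters, so the weights live in fixed-size arrays
// on the stack; no heap allocation, hence no-std friendly.
struct Dense<const IN: usize, const OUT: usize> {
    weights: [[f64; IN]; OUT],
    biases: [f64; OUT],
}

impl<const IN: usize, const OUT: usize> Dense<IN, OUT> {
    fn forward(&self, input: [f64; IN]) -> [f64; OUT] {
        // start from the biases, then accumulate the weighted inputs
        let mut out = self.biases;
        for (o, row) in out.iter_mut().zip(self.weights.iter()) {
            for (w, x) in row.iter().zip(input.iter()) {
                *o += w * x;
            }
        }
        out
    }
}

fn main() {
    let layer = Dense::<2, 1> { weights: [[1.0, 1.0]], biases: [0.5] };
    // 1.0 * 2.0 + 1.0 * 3.0 + 0.5 = 5.5
    assert_eq!(layer.forward([2.0, 3.0]), [5.5]);
}
```

Because the sizes are part of the type, passing an input array of the wrong length is a compile error rather than a runtime panic.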
See this example of a pre-trained model approximating an XOR gate, running on an ATtiny85.
§Installation
Command line:
cargo add mynn
Cargo.toml:
mynn = "0.1.2"
To use f32 in all operations, supply the f32 flag:
mynn = { version = "0.1.2", features = ["f32"] }
To remove recursion, use recurse-opt:
mynn = { version = "0.1.2", features = ["recurse-opt"] }
This will cause the recursive method calls on each layer to be inlined. On larger models this may increase the size of the generated code, so the tradeoff needs to be considered.
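The effect of such a flag can be sketched as follows; the `Layer`/`End` types and the per-layer arithmetic here are illustrative assumptions, not mynn’s internals:

```rust
// Illustrative sketch: each layer forwards to the next one recursively,
// and a cfg_attr can switch on forced inlining of that recursive call.
struct End;
struct Layer<Next>(Next);

trait Forward {
    fn forward(&self, x: f64) -> f64;
}

impl Forward for End {
    fn forward(&self, x: f64) -> f64 {
        x
    }
}

impl<N: Forward> Forward for Layer<N> {
    // with the (hypothetical) feature enabled, every layer's call is
    // inlined into its caller, flattening the recursion at compile time
    #[cfg_attr(feature = "recurse-opt", inline(always))]
    fn forward(&self, x: f64) -> f64 {
        self.0.forward(x + 1.0) // stand-in for the real layer math
    }
}

fn main() {
    let net = Layer(Layer(End)); // two layers, each adds 1.0
    assert_eq!(net.forward(0.0), 2.0);
}
```

Inlining trades call overhead for duplicated code at every layer, which is why larger models may end up with bigger binaries.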
§Example
This short example approximates the output of an XOR gate.
use mynn::make_network;
use mynn::activations::SIGMOID;

fn main() {
    let inputs = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]];
    let targets = [[0.0], [1.0], [1.0], [0.0]];

    let mut network = make_network!(2, 3, 1);
    network.train(0.5, inputs, targets, 10_000, &SIGMOID);

    println!("0 and 0: {:?}", network.predict([0.0, 0.0], &SIGMOID));
    println!("1 and 0: {:?}", network.predict([1.0, 0.0], &SIGMOID));
    println!("0 and 1: {:?}", network.predict([0.0, 1.0], &SIGMOID));
    println!("1 and 1: {:?}", network.predict([1.0, 1.0], &SIGMOID));
}
Modules§
- activations
- Contains the activation-function types and an example activation function.
- matrix
- Contains the types and functionality for processing matrices.
- network
- Contains the types and functionality for the neural network.
Macros§
- make_network - Helper macro used to initialize a neural network: simply pass a comma-separated list of the number of neurons in each layer. Works for any size of neural network.
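A comma-separated list of layer sizes suggests a recursive macro that peels the sizes off pairwise. The toy version below is a hypothetical illustration of that pattern, not mynn’s real macro:

```rust
// Toy sketch of the recursive pattern: consume the sizes two at a
// time and nest the remainder, so the whole shape ends up in the type.
#[derive(Debug, PartialEq)]
struct Output;
#[derive(Debug, PartialEq)]
struct Layer<Rest>(usize, usize, Rest); // (inputs, outputs, next layer)

macro_rules! make_network {
    // base case: exactly two sizes left, so build the final layer
    ($in:expr, $out:expr) => {
        Layer($in, $out, Output)
    };
    // recursive case: peel one size, reuse $mid as the next input size
    ($in:expr, $mid:expr, $($rest:expr),+) => {
        Layer($in, $mid, make_network!($mid, $($rest),+))
    };
}

fn main() {
    // make_network!(2, 3, 1) expands to Layer(2, 3, Layer(3, 1, Output))
    let net = make_network!(2, 3, 1);
    assert_eq!(net, Layer(2, 3, Layer(3, 1, Output)));
}
```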