# MiniNN
A minimalist deep learning crate for Rust.
## ✏️ Usage
For this example we will solve the classic XOR problem:

```rust
use ndarray::array;
use mininn::prelude::*;
```
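Fleshed out, the example looks roughly like this. This is only a sketch: the builder-style `NN::new`/`add` methods, the `Dense::new(inputs, outputs, activation)` constructor, `Cost::MSE`, and the `train`/`predict`/`save` signatures are assumptions and may differ from the crate's real API, so see `examples/xor.rs` for the actual program:

```rust
use mininn::prelude::*;
use ndarray::array;

fn main() {
    // XOR truth table: four input pairs and their expected outputs
    let train_data = array![[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]];
    let labels = array![[0.0], [1.0], [1.0], [0.0]];

    // Hypothetical builder API: a 2 -> 3 -> 1 network of Dense layers.
    // Dense takes an optional activation function (see "Default Layers").
    let mut nn = NN::new()
        .add(Dense::new(2, 3, Some(ActivationFunc::TANH)))
        .add(Dense::new(3, 1, Some(ActivationFunc::TANH)));

    // Train for 1000 epochs; the cost function and learning rate are assumptions.
    nn.train(Cost::MSE, &train_data, &labels, 1000, 0.1, true).unwrap();

    // Print one prediction per input row (the real example rounds the raw
    // network output to the 0/1 labels shown below).
    for input in train_data.rows() {
        let pred = nn.predict(&input.to_owned()).unwrap();
        println!("{} --> {}", input, pred);
    }

    // Persist the trained model (see "Save and load models" below)
    nn.save("model.h5").unwrap();
}
```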
Output
```
Epoch 1/1000 - Loss: 0.37767715592285533, Time: 0.000301444 sec
Epoch 2/1000 - Loss: 0.3209450799267143, Time: 0.000216753 sec
Epoch 3/1000 - Loss: 0.3180416337628711, Time: 0.00022032 sec
...
Epoch 998/1000 - Loss: 0.000011881245192030034, Time: 0.00021529 sec
Epoch 999/1000 - Loss: 0.000011090649737601982, Time: 0.000215882 sec
Epoch 1000/1000 - Loss: 0.000011604905569853055, Time: 0.000215721 sec
Training Completed!
Total Training Time: 0.22 sec
Predictions:
[0, 0] --> 0
[0, 1] --> 1
[1, 0] --> 1
[1, 1] --> 0
Confusion matrix:
[[2, 0],
 [0, 2]]
Accuracy: 1
Recall: 1
Precision: 1
F1: 1
Loss: 0.000011604905569853055
Model saved successfully!
```
### Metrics
You can also calculate metrics for your models using `MetricsCalculator`:

```rust
// Constructor and method names here are reconstructions; check the crate docs.
let metrics = MetricsCalculator::new(&labels, &predictions);
println!("Confusion matrix:\n{}", metrics.confusion_matrix());
println!("Accuracy: {}\nRecall: {}\nPrecision: {}\nF1: {}", metrics.accuracy(),
    metrics.recall(), metrics.precision(), metrics.f1_score());
```
This is the output of the iris example:

```
Confusion matrix:
[[26, 0, 0],
 [0, 28, 1],
 [0, 2, 18]]
Accuracy: 0.96
Recall: 0.9551724137931035
Precision: 0.960233918128655
F1: 0.9574098218166016
```
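As a sanity check, the accuracy can be read directly off the confusion matrix: the diagonal holds the correct predictions, so accuracy = (26 + 28 + 18) / 75 = 0.96. The reported recall and precision correspond to macro-averages of the per-class values over the three classes.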
## Default Layers
For now, the crate only offers two types of layers:
| Layer | Description |
|-------|-------------|
| `Dense` | Fully connected layer where each neuron connects to every neuron in the previous layer. It computes the weighted sum of inputs, adds a bias term, and applies an optional activation function (e.g., ReLU, Sigmoid). This layer is fundamental for transforming input data in deep learning models. |
| `Activation` | Applies a non-linear transformation (activation function) to its inputs. Common activation functions include ReLU, Sigmoid, Tanh, and Softmax. These functions introduce non-linearity to the model, allowing it to learn complex patterns. |
> [!NOTE]
> More layers in the future.
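Putting the two layer types together, a typical stack alternates them. The following is a minimal sketch, assuming hypothetical `Dense::new` and `Activation::new` constructors and an `ActivationFunc` enum; the crate's real signatures may differ:

```rust
use mininn::prelude::*;

fn main() {
    // Hypothetical constructors: Dense layers without built-in activations,
    // each followed by an explicit Activation layer.
    let _nn = NN::new()
        .add(Dense::new(784, 128, None))
        .add(Activation::new(ActivationFunc::RELU))
        .add(Dense::new(128, 10, None))
        .add(Activation::new(ActivationFunc::SOFTMAX));
}
```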
## Save and load models
When you already have a trained model, you can save it into an HDF5 file:

```rust
// The file path is illustrative; `save`/`load` signatures may differ, check the docs.
nn.save("model.h5").unwrap();
let mut nn = NN::load("model.h5").unwrap();
```
## Custom layers
All the layers in the network need to implement the `Layer` trait, so it is possible for users to create their own custom layers.
The only rule is that every layer must also implement the following traits (in addition to the `Layer` trait):

- `Debug`: Standard trait.
- `Clone`: Standard trait.
- `Serialize` and `Deserialize`: From the `serde` crate.
Here is a little example of how to create a custom layer:

```rust
use mininn::prelude::*;
use serde::{Deserialize, Serialize};
use serde_json;
use ndarray::Array1;

// The implementation of the custom layer (the name is illustrative)
#[derive(Debug, Clone, Serialize, Deserialize)]
struct CustomLayer;

// Implement the Layer trait for the custom layer
impl Layer for CustomLayer {
    // ... forward/backward logic over `Array1` inputs goes here ...
}
```
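Because the layer derives `Serialize` and `Deserialize`, the serde side can be checked with a plain `serde_json` round trip (this continues the snippet above and uses only `serde_json`, no mininn API):

```rust
let layer = CustomLayer;
let json = serde_json::to_string(&layer).unwrap();
let restored: CustomLayer = serde_json::from_str(&json).unwrap();
println!("serialized: {json}, restored: {restored:?}");
```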
If you want to use a model with a custom layer, you need to register it in the `LayerRegister`, a data structure that stores all the layer types the `NN` struct will accept.
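Here is a hedged sketch of that registration step, assuming a `LayerRegister::new` constructor, a registration method, and an `NN::load` variant that accepts the register; the real names should be checked against the crate docs:

```rust
// Hypothetical API: the method names below are assumptions.
let mut register = LayerRegister::new();
register.register_layer("Custom", CustomLayer::from_json);

// Pass the register when loading so NN can rebuild the custom layer.
let nn = NN::load("model.h5", Some(register)).unwrap();
```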
## 📖 Add the library to your project
You can add the crate with cargo:

```bash
cargo add mininn
```
Alternatively, you can manually add it to your project's `Cargo.toml` like this:

```toml
[dependencies]
mininn = "*" # Change the `*` to the current version
```
## Examples
There is a multitude of examples solving classic ML problems; if you want to see the results, just run these commands:

```bash
cargo run --example xor
cargo run --example iris
cargo run --example mnist
cargo run --example xor_load_nn
cargo run --example mnist_load_nn
```
## 📑 Libraries used
- rand - Random number generation.
- ndarray - Managing N-dimensional arrays.
- ndarray-rand - Generating random N-dimensional arrays.
- serde - Serialization.
- serde_json - JSON serialization.
- hdf5 - Model storage.
## 🏁 TODOs
- Try to improve the register system
- Add optimizers
- Add Conv2D (try Conv3D) layer
## 🔑 License
MIT - Created by Paco Algar.