# Smyl - A Machine Learning Library in Rust
Smyl is a machine learning library written in Rust, providing a set of tools and abstractions for building and training
neural networks.
## Features
- **Matrix Operations**: Smyl provides a `Matrix` struct for efficient matrix operations, including addition,
subtraction, multiplication, and more.
- **Activation Functions**: Smyl includes common activation functions such as ReLU and Sigmoid.
- **Layers**: Smyl defines two main layer types: `SignalLayer` and `SynapseLayer`, which can be used to build neural
network architectures.
- **Macros (optional)**: With the `macros` feature enabled, Smyl provides a set of macros to simplify the creation of
neural networks.
- **Idx3 Support (optional)**: With the `idx3` feature enabled, Smyl can read and process data in the IDX3 format,
commonly used for storing images.
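As a taste of what the activation functions compute, here is the plain math behind ReLU and Sigmoid in stdlib-only Rust. This is an illustrative sketch of the formulas, not Smyl's own implementations or API:

```rust
// Illustrative only: the math behind the ReLU and Sigmoid activations.
// Smyl ships its own versions of these; this sketch just shows the formulas.
fn relu(x: f64) -> f64 {
    // max(0, x): passes positives through, clamps negatives to zero.
    x.max(0.0)
}

fn sigmoid(x: f64) -> f64 {
    // 1 / (1 + e^(-x)): squashes any real input into (0, 1).
    1.0 / (1.0 + (-x).exp())
}

fn main() {
    assert_eq!(relu(-2.0), 0.0);
    assert_eq!(relu(3.0), 3.0);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    println!("activation sanity checks passed");
}
```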
## Getting Started
To use Smyl, add the following to your `Cargo.toml` file:
```toml
[dependencies]
smyl = "0.1.1"
```
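To enable the optional features described above, list them in the dependency entry (feature names `macros` and `idx3` as given in the feature list):

```toml
[dependencies]
smyl = { version = "0.1.1", features = ["macros", "idx3"] }
```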
Then, in your Rust code, you can start with this hello-world example:
```rust
extern crate rand;
extern crate smyl;

use rand::{thread_rng, Rng};
use smyl::layer::LayerGradient;
use smyl::prelude::*;

fn main() {
    let mut rng = thread_rng();

    // A single training sample and its target output.
    let input: SignalLayer<f64, 4> = Matrix([[1.0, 0.0, 1.0, 0.5]]);
    let expected: SignalLayer<f64, 4> = Matrix([[1.0, 0.0, 0.5, 0.75]]);
    let learning_rate: f64 = 0.01;

    // A 4 -> 2 -> 2 -> 4 network with randomly initialized weights.
    let mut synapses1: SynapseLayer<f64, 4, 2, ReLU> = rng.gen();
    let mut synapses2: SynapseLayer<f64, 2, 2, ReLU> = rng.gen();
    let mut synapses3: SynapseLayer<f64, 2, 4, ReLU> = rng.gen();

    let total_epochs = 1_000_000;
    let chunk_epochs = 1;
    for _ in 0..total_epochs / chunk_epochs {
        let mut gradient1 = LayerGradient::default();
        let mut gradient2 = LayerGradient::default();
        let mut gradient3 = LayerGradient::default();
        for _ in 0..chunk_epochs {
            // Forward pass through the three layers.
            let output1 = synapses1.forward(input);
            let output2 = synapses2.forward(output1);
            let output3 = synapses3.forward(output2);

            // Backward pass: propagate the error and accumulate gradients.
            let output_gradient3 = output3 - expected;
            let output_gradient2 =
                synapses3.backward(output2, output3, output_gradient3, &mut gradient3);
            let output_gradient1 =
                synapses2.backward(output1, output2, output_gradient2, &mut gradient2);
            synapses1.backward(input, output1, output_gradient1, &mut gradient1);
            dbg!(output3);
        }
        // Apply the gradients accumulated over the chunk.
        synapses3.apply_gradient(gradient3, learning_rate);
        synapses2.apply_gradient(gradient2, learning_rate);
        synapses1.apply_gradient(gradient1, learning_rate);
    }
}
```
Alternatively, with the `macros` feature enabled, you can let the `ann!` macro build the network for you:
```rust
use smyl::prelude::*;

fn main() {
    // A 4 -> 6 -> 4 network built by the `ann!` macro.
    let mut ann = ann!(4, 6, 4);
    let input = Matrix([[1.0, 0.0, 1.0, 0.5]]);
    let expected = Matrix([[1.0, 0.0, 0.0, 0.0]]);
    let batch = 100;
    let learning_rate = 0.01;
    for i in 0..10000 {
        let output = ann.forward(input);
        ann.backward(expected);
        if i % batch == 0 {
            // Apply the gradient accumulated over the last batch.
            ann.apply_chunk_gradient(learning_rate * batch as f64);
            println!("{output:?}");
        }
    }
}
```
## Examples
You can find more examples of using Smyl in the `examples` directory of the repository:
- [**Hello World**](examples/hello.rs): A simple example demonstrating the basic usage of the library.
- [**Macros**](examples/macro.rs): An example showcasing the use of macros for creating neural networks.
- [**MNIST**](examples/mnist/main.rs): A more complex example that demonstrates training a neural network on the MNIST
dataset.
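
The MNIST example relies on the optional `idx3` feature. The IDX3 image format used by the MNIST files is simply a big-endian header followed by raw pixel bytes; as a crate-independent illustration (not Smyl's reader), here is a sketch of decoding that header:

```rust
// Illustrative sketch of the IDX3 header layout (as used by MNIST images):
// bytes  0..4  : magic number (0x00000803 = unsigned bytes, 3 dimensions)
// bytes  4..8  : number of images (big-endian u32)
// bytes  8..12 : number of rows
// bytes 12..16 : number of columns
fn read_be_u32(bytes: &[u8]) -> u32 {
    u32::from_be_bytes([bytes[0], bytes[1], bytes[2], bytes[3]])
}

fn main() {
    // A synthetic header describing 2 images of 28x28 pixels.
    let header: [u8; 16] = [
        0, 0, 8, 3,  // magic
        0, 0, 0, 2,  // image count
        0, 0, 0, 28, // rows
        0, 0, 0, 28, // columns
    ];
    assert_eq!(read_be_u32(&header[0..4]), 0x0000_0803);
    assert_eq!(read_be_u32(&header[4..8]), 2);
    assert_eq!(read_be_u32(&header[8..12]), 28);
    assert_eq!(read_be_u32(&header[12..16]), 28);
    println!("parsed IDX3 header");
}
```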
## Documentation
For more detailed documentation, please refer to the [docs.rs](https://docs.rs/smyl/latest/smyl/) page.
## Contributing
Contributions are welcome! Please feel free to submit a pull request or open an issue for any suggestions or
improvements.
## License
This project is licensed under the GNU GPLv3. See the [LICENSE](LICENSE) file for details.
## Acknowledgments
- Thanks to the Rust community for their support and contributions.
- Special thanks to the [3Blue1Brown](https://www.3blue1brown.com/) channel for its accessible explanations of the
mathematics involved in machine learning.