Modular neural networks in Rust.
Create modular neural networks in Rust with ease! Intended for educational purposes; operations are not thoroughly optimized.
Quickstart

```rust
// Note: the crate-path prefixes of these imports were omitted here;
// adjust the `use` paths to match the crate's actual module layout.
use Shape;
use Network;
use Activation;
use Optimizer;
use Objective;
```
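A minimal end-to-end sketch of how these pieces might fit together. All method and variant names below (`Network::new`, `dense`, `set_objective`, `set_optimizer`, the `Adam` field) are illustrative assumptions, not the crate's confirmed API:

```rust
// Hypothetical usage sketch; names are assumptions, not the actual API.
fn main() {
    // Describe the input as a flat vector of four features.
    let mut network = Network::new(Shape::Vector(4));

    // Stack two dense layers with their activations.
    network.dense(16, Activation::ReLU);
    network.dense(3, Activation::Softmax);

    // Choose an objective and an optimizer before training.
    network.set_objective(Objective::CrossEntropy);
    network.set_optimizer(Optimizer::Adam { learning_rate: 1e-3 });
}
```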
Examples can be found in the `examples` directory.
Progress
- Layer types
  - Dense
  - Convolutional (output-size sketch below)
    - Forward pass
      - Padding
      - Stride
      - Dilation
    - Backward pass
      - Padding
      - Stride
      - Dilation
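Padding, stride, and dilation interact through the standard output-size formula for a convolution; a standalone sketch of that formula, not the crate's implementation:

```rust
/// Output size along one spatial axis of a convolution with padding,
/// stride, and dilation; integer division gives the usual floor behavior.
fn conv_output_size(input: usize, kernel: usize, padding: usize, stride: usize, dilation: usize) -> usize {
    let effective_kernel = dilation * (kernel - 1) + 1;
    (input + 2 * padding - effective_kernel) / stride + 1
}
```

For example, `conv_output_size(32, 3, 1, 1, 1)` returns 32: padding of 1 preserves the spatial size of a 32-wide input under a 3-wide kernel.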
- Activation functions (definitions sketched below)
  - Linear
  - Sigmoid
  - Tanh
  - ReLU
  - LeakyReLU
  - Softmax
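As a reminder of the math behind these, here are element-wise versions as plain functions; a standalone sketch (`alpha` is the LeakyReLU slope), not the crate's implementation:

```rust
/// Element-wise activations (f32 for brevity).
fn sigmoid(x: f32) -> f32 { 1.0 / (1.0 + (-x).exp()) }
fn relu(x: f32) -> f32 { x.max(0.0) }
fn leaky_relu(x: f32, alpha: f32) -> f32 { if x >= 0.0 { x } else { alpha * x } }

/// Softmax over a slice, shifted by the maximum for numerical stability.
fn softmax(xs: &[f32]) -> Vec<f32> {
    let max = xs.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = xs.iter().map(|x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}
```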
- Objective functions (MSE/MAE sketched below)
  - AE
  - MAE
  - MSE
  - RMSE
  - CrossEntropy
  - BinaryCrossEntropy
  - KLDivergence
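The regression objectives are closely related: MSE averages squared errors, MAE averages absolute errors, and RMSE is the square root of MSE. A standalone sketch, not the crate's implementation:

```rust
/// Mean squared error between predictions and targets of equal length.
fn mse(pred: &[f32], target: &[f32]) -> f32 {
    let n = pred.len() as f32;
    pred.iter().zip(target).map(|(p, t)| (p - t).powi(2)).sum::<f32>() / n
}

/// Mean absolute error; RMSE is then simply `mse(pred, target).sqrt()`.
fn mae(pred: &[f32], target: &[f32]) -> f32 {
    let n = pred.len() as f32;
    pred.iter().zip(target).map(|(p, t)| (p - t).abs()).sum::<f32>() / n
}
```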
- Optimization techniques (SGDM step sketched below)
  - SGD
  - SGDM
  - Adam
  - AdamW
  - RMSprop
  - Minibatch
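As an illustration of the update rules involved, here is a single SGD-with-momentum (SGDM) step over a flat parameter slice; a standalone sketch, not the crate's optimizer:

```rust
/// One SGDM step: velocity = momentum * velocity - lr * grad, then
/// param += velocity. Plain SGD is the special case momentum = 0.
fn sgdm_step(params: &mut [f32], grads: &[f32], velocity: &mut [f32], lr: f32, momentum: f32) {
    for ((p, g), v) in params.iter_mut().zip(grads).zip(velocity.iter_mut()) {
        *v = momentum * *v - lr * g;
        *p += *v;
    }
}
```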
- Architecture
  - Feedforward (dubbed `Network`)
  - Convolutional
  - Recurrent
  - Feedback connections
    - Dense to Dense
    - Dense to Convolutional
    - Convolutional to Dense
    - Convolutional to Convolutional
- Regularization
  - Dropout (sketch below)
  - Batch normalization
  - Early stopping
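Dropout is usually implemented in its "inverted" form, scaling surviving activations at training time so inference needs no rescaling. A standalone sketch assuming the external `rand` crate, not the crate's implementation:

```rust
use rand::Rng; // assumes the external `rand` crate

/// Inverted dropout: zero each activation with probability `rate` and
/// scale survivors by 1/(1 - rate) so the expected value is unchanged.
/// Training-time only; requires 0.0 <= rate < 1.0.
fn dropout(activations: &mut [f32], rate: f32) {
    let mut rng = rand::thread_rng();
    let keep = 1.0 - rate;
    for a in activations.iter_mut() {
        *a = if rng.gen::<f32>() < keep { *a / keep } else { 0.0 };
    }
}
```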
- Parallelization
  - Multi-threading
- Testing
  - Unit tests
    - Thorough testing of algebraic operations
    - Thorough testing of activation functions
    - Thorough testing of objective functions
    - Thorough testing of optimization techniques
    - Thorough testing of feedback scaling (wrt. gradients)
  - Integration tests
- Examples
  - XOR
  - Iris
    - MLP
    - MLP + Feedback
  - Linear regression
    - MLP
    - MLP + Feedback
  - Classification (TBA)
    - MLP
    - MLP + Feedback
  - MNIST
    - MLP
    - MLP + Feedback
    - CNN
    - CNN + Feedback
  - CIFAR-10
    - CNN
    - CNN + Feedback
- Other
  - Documentation
  - Custom random weight initialization
  - Custom tensor type
  - Plotting
  - Data from file
  - General data loading functionality
  - Custom icon/image for documentation
  - Custom stylesheet for documentation
  - Type conversion (e.g., f32, f64)
  - Network type specification (e.g., f32, f64)
  - Saving and loading
    - Single layer weights
    - Entire network weights
    - Custom (binary) file format, with header explaining contents (sketch below)
  - Logging
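A header-plus-payload binary format like the one listed above might look as follows; every detail (magic bytes, version field, little-endian layout, the `save_weights` name) is an assumption for illustration, not the crate's actual format:

```rust
use std::io::{self, Write};

/// Write weights behind a minimal self-describing header (hypothetical
/// layout): magic bytes, a format version, and the value count, followed
/// by the raw little-endian f32 weights.
fn save_weights<W: Write>(mut out: W, weights: &[f32]) -> io::Result<()> {
    out.write_all(b"NNET")?;                                // magic
    out.write_all(&1u32.to_le_bytes())?;                    // format version
    out.write_all(&(weights.len() as u64).to_le_bytes())?;  // value count
    for w in weights {
        out.write_all(&w.to_le_bytes())?;
    }
    Ok(())
}
```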