# Modular neural networks in Rust

Create modular neural networks in Rust with ease!
## Create a network

```rust
use ;
```
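The snippet above is truncated in this copy of the README. As a standalone illustration of the modular idea (a network as a stack of layers), and explicitly not this crate's API, a minimal sketch could look like:

```rust
// Standalone illustration of a modular network: a stack of layers applied
// in order. All names here (Layer, Dense, Relu) are hypothetical, not this
// crate's types.
trait Layer {
    fn forward(&self, input: Vec<f64>) -> Vec<f64>;
}

struct Dense {
    weights: Vec<Vec<f64>>,
    bias: Vec<f64>,
}

impl Layer for Dense {
    fn forward(&self, input: Vec<f64>) -> Vec<f64> {
        // One output per weight row: dot product plus bias.
        self.weights
            .iter()
            .zip(&self.bias)
            .map(|(row, b)| row.iter().zip(&input).map(|(w, x)| w * x).sum::<f64>() + b)
            .collect()
    }
}

struct Relu;
impl Layer for Relu {
    fn forward(&self, input: Vec<f64>) -> Vec<f64> {
        input.into_iter().map(|x| x.max(0.0)).collect()
    }
}

// A "network" is just the layers applied in sequence.
fn forward(layers: &[Box<dyn Layer>], input: Vec<f64>) -> Vec<f64> {
    layers.iter().fold(input, |x, layer| layer.forward(x))
}

fn main() {
    let layers: Vec<Box<dyn Layer>> = vec![
        Box::new(Dense { weights: vec![vec![1.0, -1.0]], bias: vec![0.0] }),
        Box::new(Relu),
    ];
    println!("{:?}", forward(&layers, vec![2.0, 1.0])); // [1.0]
}
```

The crate's real constructors and layer types will differ (only `network::Network` and `Network::learn` are named elsewhere in this README); the examples directory shows actual usage.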
## Examples
Examples can be found in the examples directory.
## 0.3.0 (Batched training; parallelization)

- Batched training (`network::Network::learn`)
- Parallelization of batches (`rayon`)

Benchmarking `example/example_benchmark.rs`:

- v0.3.0: 0.318811179s (6.95x speedup)
- v0.2.2: 2.218362758s
## 0.2.2 (Convolution)

- Convolutional layer
- Improved documentation

## 0.2.0 (Feedback)

- Feedback connections

## 0.1.5

- Improved documentation

## 0.1.1

- Custom tensor struct
- Unit tests

## 0.1.0 (Dense)

- Dense feedforward network
- Activation functions
- Objective functions
- Optimization techniques
## Layer types
- [x] Dense
- [x] Convolutional
  - [x] Forward pass
    - [x] Padding
    - [x] Stride
    - [ ] Dilation
  - [x] Backward pass
    - [x] Padding
    - [x] Stride
    - [ ] Dilation
- [x] Max pooling
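The checked padding and stride behaviour can be illustrated with a standalone sketch of a 2D convolution forward pass; this is illustrative only, not this crate's implementation:

```rust
// Standalone sketch of a 2D convolution forward pass with zero padding and
// stride; illustrative only, not this crate's implementation.
fn conv2d(input: &[Vec<f64>], kernel: &[Vec<f64>], padding: usize, stride: usize) -> Vec<Vec<f64>> {
    let (h, w) = (input.len(), input[0].len());
    let (kh, kw) = (kernel.len(), kernel[0].len());
    let (oh, ow) = (
        (h + 2 * padding - kh) / stride + 1,
        (w + 2 * padding - kw) / stride + 1,
    );
    let mut output = vec![vec![0.0; ow]; oh];
    for i in 0..oh {
        for j in 0..ow {
            let mut sum = 0.0;
            for ki in 0..kh {
                for kj in 0..kw {
                    // Coordinates in the (virtually) zero-padded input.
                    let (row, col) = (i * stride + ki, j * stride + kj);
                    // Contributions from the padding region are zero.
                    if row >= padding && col >= padding && row - padding < h && col - padding < w {
                        sum += input[row - padding][col - padding] * kernel[ki][kj];
                    }
                }
            }
            output[i][j] = sum;
        }
    }
    output
}

fn main() {
    let input = vec![vec![1.0, 2.0], vec![3.0, 4.0]];
    let identity = vec![vec![1.0]]; // 1x1 identity kernel
    println!("{:?}", conv2d(&input, &identity, 0, 1)); // reproduces the input
}
```

Dilation (the unchecked item) would scale `ki` and `kj` by a dilation factor when computing `row` and `col`.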
## Activation functions
- [x] Linear
- [x] Sigmoid
- [x] Tanh
- [x] ReLU
- [x] LeakyReLU
- [x] Softmax
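As a standalone illustration (not this crate's API), several of the checked activations can be written as plain functions:

```rust
// Standalone sketches of some listed activation functions;
// not this crate's API.
fn relu(x: f64) -> f64 { x.max(0.0) }
fn leaky_relu(x: f64) -> f64 { if x > 0.0 { x } else { 0.01 * x } }
fn sigmoid(x: f64) -> f64 { 1.0 / (1.0 + (-x).exp()) }

// Numerically stable softmax over a slice.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let max = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    println!("{}", sigmoid(0.0)); // 0.5
    println!("{:?}", softmax(&[1.0, 1.0])); // uniform: [0.5, 0.5]
}
```

Subtracting the row maximum before exponentiating keeps softmax stable for large inputs.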
## Objective functions
- [x] AE
- [x] MAE
- [x] MSE
- [x] RMSE
- [x] CrossEntropy
- [x] BinaryCrossEntropy
- [x] KLDivergence
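Two of the checked objectives, sketched as standalone functions (illustrative, not this crate's implementation):

```rust
// Standalone sketches of two listed objectives; illustrative only,
// not this crate's implementation.
fn mse(pred: &[f64], target: &[f64]) -> f64 {
    pred.iter()
        .zip(target)
        .map(|(p, t)| (p - t).powi(2))
        .sum::<f64>() / pred.len() as f64
}

// Assumes predictions are probabilities in (0, 1) and targets are 0/1.
fn binary_cross_entropy(pred: &[f64], target: &[f64]) -> f64 {
    -pred.iter()
        .zip(target)
        .map(|(p, t)| t * p.ln() + (1.0 - t) * (1.0 - p).ln())
        .sum::<f64>() / pred.len() as f64
}

fn main() {
    println!("{}", mse(&[0.0, 0.0], &[2.0, 2.0])); // mean of squared errors: 4
    println!("{}", binary_cross_entropy(&[0.9, 0.1], &[1.0, 0.0]));
}
```

RMSE is just the square root of `mse`, and `CrossEntropy` generalizes the binary case to multiple classes.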
## Optimization techniques
- [x] SGD
- [x] SGDM
- [x] Adam
- [x] AdamW
- [x] RMSprop
- [x] Minibatch
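SGD with momentum (SGDM) reduces to a one-line velocity update per parameter; a standalone sketch, not this crate's optimizer API:

```rust
// One SGD-with-momentum (SGDM) update over a flat parameter slice;
// a standalone sketch, not this crate's optimizer API.
fn sgdm_step(params: &mut [f64], grads: &[f64], velocity: &mut [f64], lr: f64, momentum: f64) {
    for i in 0..params.len() {
        // Accumulate a decaying sum of past gradients...
        velocity[i] = momentum * velocity[i] + grads[i];
        // ...and step against it.
        params[i] -= lr * velocity[i];
    }
}

fn main() {
    let (mut w, mut v) = (vec![1.0], vec![0.0]);
    sgdm_step(&mut w, &[0.5], &mut v, 0.1, 0.9);
    println!("{:?}", w); // first step is plain SGD: w = 1.0 - 0.1 * 0.5
}
```

With `momentum = 0.0` this reduces to plain SGD.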
## Architecture
- [x] Feedforward (dubbed `Network`)
- [x] Convolutional
- [ ] Recurrent
- [ ] Feedback connections
  - [x] Dense to Dense
  - [ ] Dense to Convolutional
  - [ ] Convolutional to Dense
  - [ ] Convolutional to Convolutional
## Regularization
- [x] Dropout
- [ ] Batch normalization
- [ ] Early stopping
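The checked dropout item is commonly implemented as inverted dropout; a standalone, deterministic sketch (not this crate's implementation):

```rust
// Inverted dropout: zero an activation with probability `rate`, and scale
// survivors by 1 / (1 - rate) so the expected activation is unchanged.
// A caller-supplied `keep` mask stands in for a random number generator,
// keeping the sketch deterministic; not this crate's implementation.
fn dropout(activations: &mut [f64], keep: &[bool], rate: f64) {
    for (a, &k) in activations.iter_mut().zip(keep) {
        *a = if k { *a / (1.0 - rate) } else { 0.0 };
    }
}

fn main() {
    let mut a = vec![1.0, 1.0];
    dropout(&mut a, &[true, false], 0.5);
    println!("{:?}", a); // [2.0, 0.0]
}
```

At inference time dropout is simply skipped; the training-time rescaling keeps expected activations consistent between the two modes.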
## Parallelization
- [x] Parallelization of batches
- [ ] Other parallelization?
  - NOTE: Slowdown when parallelizing _everything_ (commit: 1f94cea56630a46d40755af5da20714bc0357146).
## Testing
- [x] Unit tests
  - [x] Thorough testing of algebraic operations
  - [x] Thorough testing of activation functions
  - [x] Thorough testing of objective functions
  - [x] Thorough testing of optimization techniques
  - [ ] Thorough testing of feedback scaling (wrt. gradients)
- [ ] Integration tests
  - [x] Network forward pass
  - [x] Network backward pass
  - [ ] Network training (i.e., weight updates)
## Examples
- [x] XOR
- [x] Iris
  - [x] MLP
  - [ ] MLP + Feedback
- [ ] Linear regression
  - [ ] MLP
  - [ ] MLP + Feedback
- [ ] Classification TBA.
  - [ ] MLP
  - [ ] MLP + Feedback
- [ ] MNIST
  - [ ] MLP
  - [ ] MLP + Feedback
  - [x] CNN
  - [ ] CNN + Feedback
- [ ] CIFAR-10
  - [ ] CNN
  - [ ] CNN + Feedback
## Other
- [x] Documentation
- [x] Custom random weight initialization
- [x] Custom tensor type
- [x] Plotting
- [x] Data from file
  - [ ] General data loading functionality
- [x] Custom icon/image for documentation
- [x] Custom stylesheet for documentation
- [ ] Type conversion (e.g. f32, f64)
- [ ] Network type specification (e.g. f32, f64)
- [ ] Saving and loading
  - [ ] Single layer weights
  - [ ] Entire network weights
  - [ ] Custom (binary) file format, with header explaining contents
- [ ] Logging
- [x] Add number of parameters when displaying `Network`
## Sources
- backpropagation
- softmax
- momentum
- Adam
- AdamW
- RMSprop
- backpropagation convolution 1
- backpropagation convolution 2
- backpropagation convolution 3