# Neural-Network
A simple neural network written in Rust.
## About
This gradient-descent-based neural network is written completely from the ground up in Rust.
You can specify the shape of the network as well as its learning rate. Additionally, you can choose from a number of predefined datasets, for example the XOR and CIRCLE datasets, which represent their respective functions inside the unit square, or more complex datasets like RGB_DONUT, which represents a donut-like shape with a rainbow-like color transition.
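For intuition, the simpler datasets are just functions on the unit square. The snippet below uses a hypothetical `xor_label` helper (not part of this crate) to illustrate the labelling rule behind the XOR dataset:

```rust
/// Hypothetical helper (not part of this crate): the XOR dataset labels a
/// point (x, y) in the unit square with 1.0 if exactly one coordinate lies
/// above 0.5, and 0.0 otherwise.
fn xor_label(x: f64, y: f64) -> f64 {
    if (x > 0.5) != (y > 0.5) { 1.0 } else { 0.0 }
}

fn main() {
    // Sample one point from each quadrant of the unit square.
    for (x, y) in [(0.1, 0.1), (0.1, 0.9), (0.9, 0.1), (0.9, 0.9)] {
        println!("({x}, {y}) -> {}", xor_label(x, y));
    }
}
```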
In the example training process further below, you can see the network learning the color values of the RGB_DONUT dataset.
## Features
The following features are currently implemented:
- **Optimizers**
1. Adam
2. RMSProp
3. SGD
- **Loss Functions**
1. Quadratic
- **Activation Functions**
1. Sigmoid
2. ReLU
- **Layers**
1. Dense
- **Plotting**
1. Plotting the cost-history during training
2. Plotting the final predictions, either in grayscale or RGB
## Usage
The process of creating and training the neural network is pretty straightforward:
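A minimal sketch of what this might look like is shown below. Every type and method name here (`Network`, `Activation`, `Optimizer`, `Dataset`, `train`, the plotting calls) is an illustrative assumption rather than this crate's verified API, so consult the source for the exact names:

```rust
// Illustrative sketch only: every type and method name here is an
// assumption about the API, not taken from the crate itself.
fn main() {
    // Specify the shape of the network: 2 inputs, two hidden dense
    // layers of 16 neurons each, 3 outputs (RGB).
    let mut network = Network::new(&[2, 16, 16, 3])
        .activation(Activation::ReLU)   // hidden-layer activation
        .optimizer(Optimizer::Adam)     // one of Adam / RMSProp / SGD
        .learning_rate(0.01);           // configurable learning rate

    // Pick one of the predefined datasets, e.g. RGB_DONUT.
    let dataset = Dataset::RgbDonut;

    // Train with the quadratic loss for a number of epochs.
    network.train(&dataset, 10_000);

    // Plot the cost history and the final predictions (RGB).
    network.plot_cost_history("cost.png");
    network.plot_predictions("predictions.png");
}
```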

## Example Training Process
Below, you can see how the network learns:
### Learning Animation
<https://user-images.githubusercontent.com/54124311/195410077-7a02b075-0269-4ff2-965f-97f224ab2cf1.mp4>
### Final Result

## Cool Training Results
### RGB_DONUT
#### Big Network


#### Small Network


### XOR_PROBLEM

