Struct neuroflow::FeedForward
pub struct FeedForward { /* fields omitted */ }
Feed-forward (multilayer perceptron) neural network trained by the backpropagation algorithm. It can be used for both approximation and classification tasks.
Examples
In order to create a FeedForward instance, call its constructor new. The constructor accepts a slice as an argument; this slice determines the architecture of the network. The first element of the slice is the number of neurons in the input layer and the last one is the number of neurons in the output layer. Note that each input vector must have the same length as the input layer of the FeedForward neural network (likewise for the expected output vector).
```rust
use neuroflow::FeedForward;

let mut nn = FeedForward::new(&[1, 3, 2]);
```
Then you can train your network step by step via the fit method:
```rust
nn.fit(&[1.2], &[0.2, 0.8]);
```
Or use the train method with the neuroflow::data::DataSet struct:
```rust
use neuroflow::data::DataSet;

let mut data = DataSet::new();
data.push(&[1.2], &[1.3, -0.2]);
nn.train(&data, 30_000); // 30_000 is the iteration count
```
It is possible to set the parameters of the network:

```rust
use neuroflow::FeedForward;

let mut nn = FeedForward::new(&[1, 3, 2]);

nn.learning_rate(0.1)
    .momentum(0.05)
    .activation(neuroflow::activators::Type::Tanh);
```
Call the calc method in order to compute a value with your (already trained) network:
```rust
let d: Vec<f64> = nn.calc(&[1.02]).to_vec();
```
Methods
impl FeedForward
fn new(architecture: &[i32]) -> FeedForward
The constructor of the FeedForward struct.

architecture: &[i32]
- the architecture of the network, where each element of the slice is the number of neurons in the corresponding layer. The first element is the number of neurons in the input layer and the last one is the number of neurons in the output layer. Note that each input vector must have the same length as the input layer of the FeedForward neural network (likewise for the expected output vector).

return -> FeedForward
- the constructed struct.

Example

```rust
use neuroflow::FeedForward;

let mut nn = FeedForward::new(&[1, 3, 2]);
```
fn bind(&mut self, layer: usize, neuron: usize)
Bind a new neuron to a layer. The neuron is initialized with random weights.
layer: usize
- index of the layer. NOTE: layer indexing starts from 1!

neuron: usize
- index of the neuron. NOTE: neuron indexing within a layer starts from 0!
Examples
```rust
nn.bind(2, 0);
```
fn unbind(&mut self, layer: usize, neuron: usize)
Unbind a neuron from a layer.
layer: usize
- index of the layer. NOTE: layer indexing starts from 1!

neuron: usize
- index of the neuron. NOTE: neuron indexing within a layer starts from 0!
Examples
```rust
nn.unbind(2, 0);
```
fn train<T>(&mut self, data: &T, iterations: i64) where T: Extractable

Train the neural network on bulk data.
data: &T
- reference to data that implements the neuroflow::data::Extractable trait;

iterations: i64
- iteration count.
Examples
```rust
let mut d = neuroflow::data::DataSet::new();
d.push(&[1.2], &[1.3, -0.2]);
nn.train(&d, 30_000);
```
fn fit(&mut self, X: &[f64], d: &[f64])
Train the neural network incrementally, step by step.
X: &[f64]
- slice of input data;

d: &[f64]
- expected output.
Examples
```rust
nn.fit(&[3.0], &[3.0, 5.0]);
```
fn calc(&mut self, X: &[f64]) -> &[f64]
Calculate the response of the trained neural network.
X: &[f64]
- slice of input data;

return -> &[f64]
- slice of calculated output.
Examples
```rust
let v: Vec<f64> = nn.calc(&[1.02]).to_vec();
```
fn activation(&mut self, func: Type) -> &mut FeedForward
Choose activation function.
func: neuroflow::activators::Type
- enum variant indicating which activation function to use;

return -> &mut FeedForward
- mutable reference to the current struct.
fn learning_rate(&mut self, learning_rate: f64) -> &mut FeedForward
Set the learning rate of network.
learning_rate: f64
- the learning rate;

return -> &mut FeedForward
- mutable reference to the current struct.
Examples
```rust
nn.learning_rate(0.1);
```
fn momentum(&mut self, momentum: f64) -> &mut FeedForward
Set the momentum of network.
momentum: f64
- the momentum;

return -> &mut FeedForward
- mutable reference to the current struct.
Example
```rust
nn.momentum(0.05);
```