Struct neuroflow::FeedForward
pub struct FeedForward { /* private fields */ }
Feed forward (multilayer perceptron) neural network trained by the backpropagation algorithm. It can be used for approximation and classification tasks.
Examples
To create a FeedForward instance, call its constructor new.
The constructor accepts a slice as an argument. This slice determines the architecture of the network: the first element is the number of neurons in the input layer, and the last is the number of neurons in the output layer. Note that an input vector must have the same length as the input layer of the FeedForward neural network (the same holds for the expected output vector).
use neuroflow::FeedForward;
let mut nn = FeedForward::new(&[1, 3, 2]);
Then you can train your network step by step via the fit method:
nn.fit(&[1.2], &[0.2, 0.8]);
Or use the train method with the neuroflow::data::DataSet struct:
use neuroflow::data::DataSet;
let mut data = DataSet::new();
data.push(&[1.2], &[1.3, -0.2]);
nn.train(&data, 30_000); // 30_000 is iterations count
It is possible to set the parameters of the network:
nn.learning_rate(0.1)
.momentum(0.05)
.activation(neuroflow::activators::Type::Tanh);
Call the calc method to compute a value with your (already trained) network:
let d: Vec<f64> = nn.calc(&[1.02]).to_vec();
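Putting the pieces together, a complete session might look like the following sketch (all numeric values are illustrative):
use neuroflow::FeedForward;
use neuroflow::data::DataSet;

// Build a 1-3-2 network and tune its hyperparameters.
let mut nn = FeedForward::new(&[1, 3, 2]);
nn.learning_rate(0.1)
    .momentum(0.05)
    .activation(neuroflow::activators::Type::Tanh);

// Train on a single (made-up) sample, then query the trained network.
let mut data = DataSet::new();
data.push(&[1.2], &[1.3, -0.2]);
nn.train(&data, 30_000);

let out: Vec<f64> = nn.calc(&[1.02]).to_vec();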
Implementations
impl FeedForward
pub fn new(architecture: &[i32]) -> FeedForward
The constructor of the FeedForward struct.
- architecture: &[i32] - the architecture of the network, where each element of the slice is the number of neurons in that layer. The first element is the number of neurons in the input layer, and the last is the number of neurons in the output layer. Note that an input vector must have the same length as the input layer of the FeedForward neural network (the same holds for the expected output vector).
- return -> FeedForward - the constructed struct.
Example
use neuroflow::FeedForward;
let mut nn = FeedForward::new(&[1, 3, 2]);
pub fn bind(&mut self, layer: usize, neuron: usize)
Bind a new neuron to a layer. The neuron is initialized with random weights.
- layer: usize - index of the layer. NOTE: layer indexing starts from 1!
- neuron: usize - index of the neuron. NOTE: neuron indexing within a layer starts from 0!
Examples
nn.bind(2, 0);
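For context, a self-contained sketch of the same call (the architecture is illustrative):
use neuroflow::FeedForward;

let mut nn = FeedForward::new(&[2, 3, 1]);
// Layer indexing starts from 1, neuron indexing from 0 (see the NOTE above).
nn.bind(2, 0);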
pub fn unbind(&mut self, layer: usize, neuron: usize)
Unbind a neuron from a layer.
- layer: usize - index of the layer. NOTE: layer indexing starts from 1!
- neuron: usize - index of the neuron. NOTE: neuron indexing within a layer starts from 0!
Examples
nn.unbind(2, 0);
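And the mirror image of the bind sketch above, removing the neuron again (same illustrative architecture):
use neuroflow::FeedForward;

let mut nn = FeedForward::new(&[2, 3, 1]);
nn.bind(2, 0); // add a neuron with random weights to layer 2
nn.unbind(2, 0); // then remove it again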
pub fn train<T>(&mut self, data: &T, iterations: i64) where T: Extractable
Train the neural network on a batch of data.
- data: &T - a reference to data that implements the neuroflow::data::Extractable trait;
- iterations: i64 - iteration count.
Examples
use neuroflow::FeedForward;

let mut nn = FeedForward::new(&[1, 3, 2]);
let mut d = neuroflow::data::DataSet::new();
d.push(&[1.2], &[1.3, -0.2]);
nn.train(&d, 30_000);
pub fn fit(&mut self, X: &[f64], d: &[f64])
Train the neural network step by step, one sample per call.
- X: &[f64] - slice of input data;
- d: &[f64] - expected output.
Examples
nn.fit(&[3.0], &[3.0, 5.0]);
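Since fit performs a single training step per call, it is usually invoked in a loop; a minimal self-contained sketch (sample values and iteration count are illustrative):
use neuroflow::FeedForward;

let mut nn = FeedForward::new(&[1, 3, 2]);
for _ in 0..10_000 {
    nn.fit(&[3.0], &[3.0, 5.0]); // one backpropagation step per call
}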
pub fn calc(&mut self, X: &[f64]) -> &[f64]
Calculate the response of the trained neural network.
- X: &[f64] - slice of input data;
- return -> &[f64] - slice of calculated values.
Examples
let v: Vec<f64> = nn.calc(&[1.02]).to_vec();
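Note that calc takes &mut self and returns a slice borrowed from the network, so copy the result out (e.g. with to_vec(), as above) if you need to keep it while making further calls on the network.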
pub fn activation(&mut self, func: Type) -> &mut FeedForward
Choose the activation function. Note that if you pass activators::Type::Custom as the argument of this method, the default value (activators::Type::Tanh) will be used; to set a custom function, use custom_activation instead.
- func: neuroflow::activators::Type - enum variant that indicates which function to use;
- return -> &mut FeedForward - reference to the current struct.
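Examples
Switching the whole network to the hyperbolic tangent activation:
nn.activation(neuroflow::activators::Type::Tanh);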
pub fn custom_activation(&mut self, func: fn(f64) -> f64, der: fn(f64) -> f64) -> &mut FeedForward
Set a custom activation function and its derivative. The activation type is set to activators::Type::Custom.
- func: fn(f64) -> f64 - activation function to be set;
- der: fn(f64) -> f64 - derivative of the activation function;
- return -> &mut FeedForward - reference to the current struct.
Warning
Be careful when using a custom activation function. For good results, the function should be smooth, non-decreasing, and differentiable.
Example
use neuroflow::FeedForward;

fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}
fn der_sigmoid(x: f64) -> f64 {
    sigmoid(x) * (1.0 - sigmoid(x))
}

let mut nn = FeedForward::new(&[1, 3, 2]);
nn.custom_activation(sigmoid, der_sigmoid);
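Note that func and der are plain fn pointers rather than closures, so they cannot capture environment state. Backpropagation uses der to propagate the error, so a derivative that does not match func will silently degrade training.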
pub fn learning_rate(&mut self, learning_rate: f64) -> &mut FeedForward
Set the learning rate of the network.
- learning_rate: f64 - learning rate;
- return -> &mut FeedForward - reference to the current struct.
Examples
nn.learning_rate(0.1);
pub fn momentum(&mut self, momentum: f64) -> &mut FeedForward
Set the momentum of the network.
- momentum: f64 - momentum;
- return -> &mut FeedForward - reference to the current struct.
Example
nn.momentum(0.05);