Trait radiate::models::neat::layers::layer::Layer

pub trait Layer: LayerClone + Any + Debug + Serialize + Deserialize {
    fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>;
    fn backward(
        &mut self,
        errors: &Vec<f32>,
        learning_rate: f32
    ) -> Option<Vec<f32>>;
    fn as_ref_any(&self) -> &dyn Any;
    fn as_mut_any(&mut self) -> &mut dyn Any;
    fn shape(&self) -> (usize, usize);

    fn reset(&mut self) { ... }
    fn add_tracer(&mut self) { ... }
    fn remove_tracer(&mut self) { ... }
}

Layer is a layer in the neural network. In order for the network to be evolved, each layer must be cloneable, which is where LayerClone comes in: it allows a Box<dyn Layer> to be cloned without knowing which concrete type it really is under the hood. Any allows the underlying object to be downcast to a concrete type.
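The exact definition of LayerClone isn't shown on this page, but the standard boxed-clone pattern it presumably follows is sketched below (the method name clone_box is an assumption):

trait LayerClone {
    fn clone_box(&self) -> Box<dyn Layer>;
}

// Blanket impl: any cloneable Layer can produce a boxed clone of itself
impl<T> LayerClone for T
where
    T: 'static + Layer + Clone,
{
    fn clone_box(&self) -> Box<dyn Layer> {
        Box::new(self.clone())
    }
}

// This is what lets Box<dyn Layer> be cloned without knowing the concrete type
impl Clone for Box<dyn Layer> {
    fn clone(&self) -> Box<dyn Layer> {
        self.clone_box()
    }
}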

Required methods

fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>

Propagate an input vec through this layer. How this is done depends on the type of layer, just as it does for backpropagation. If the layer is only being evolved, it should not keep track of its internal meta data, because the network will never backpropagate afterwards. Returns the output as a vec.
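As a sketch of how this composes, here is a hypothetical helper (not part of radiate) that feeds one input vec through a stack of boxed layers:

// Feed `input` through each layer in order, threading the output forward.
fn feed_forward(layers: &mut Vec<Box<dyn Layer>>, input: &Vec<f32>) -> Option<Vec<f32>> {
    let mut data = input.clone();
    for layer in layers.iter_mut() {
        data = layer.forward(&data)?;   // None from any layer aborts the pass
    }
    Some(data)
}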

fn backward(
    &mut self,
    errors: &Vec<f32>,
    learning_rate: f32
) -> Option<Vec<f32>>

Take the errors of the feed forward and backpropagate them through the network to adjust the weights of the connections between the neurons. Returns the error of this layer's input neurons, which is needed to transfer the error from layer to layer.
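A matching sketch of the backward pass over the same hypothetical stack (again, the helper itself is illustrative, not part of radiate):

// Walk the layers in reverse, feeding each layer's input error to the one before it.
fn back_propagate(
    layers: &mut Vec<Box<dyn Layer>>,
    output_errors: &Vec<f32>,
    learning_rate: f32,
) -> Option<Vec<f32>> {
    let mut errors = output_errors.clone();
    for layer in layers.iter_mut().rev() {
        errors = layer.backward(&errors, learning_rate)?;
    }
    Some(errors)   // error at the original input, useful for stacked networks
}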

fn as_ref_any(&self) -> &dyn Any

Get a reference to the underlying type without generics in order to downcast to a concrete type

fn as_mut_any(&mut self) -> &mut dyn Any

Get a mutable reference to the underlying type without generics in order to downcast to a concrete type
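For example, a Box<dyn Layer> can be inspected as a concrete Dense like this (a usage sketch; the surrounding function is hypothetical):

fn print_if_dense(layer: &Box<dyn Layer>) {
    // downcast_ref returns Some only if the underlying type really is Dense
    if let Some(dense) = layer.as_ref_any().downcast_ref::<Dense>() {
        println!("dense layer shape: {:?}", dense.shape());
    }
}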

fn shape(&self) -> (usize, usize)

Return the (input size, output size) of this layer. This makes specifying layer sizes easier: the user only needs to say the size of the output, not the input, which would be redundant.
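This is also what makes a chain of layers checkable: each layer's output size must equal the next layer's input size. A small illustrative helper (not part of radiate):

// Returns true if `prev` can feed directly into `next`.
fn shapes_align(prev: &dyn Layer, next: &dyn Layer) -> bool {
    let (_, prev_out) = prev.shape();
    let (next_in, _) = next.shape();
    prev_out == next_in
}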


Provided methods

fn reset(&mut self)

Reset the layer; implementing this is not necessary.

fn add_tracer(&mut self)

Add a tracer to the layer to keep track of historical meta data.

fn remove_tracer(&mut self)

Remove the tracer from a layer so that it can be evolved without keeping track of meta data.
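A hedged sketch of the train-then-evolve toggle these two methods enable (the function, inputs, and learning rate are placeholders):

fn train_step(layer: &mut Box<dyn Layer>, input: &Vec<f32>, errors: &Vec<f32>) -> Option<()> {
    layer.add_tracer();                                 // record the meta data backprop needs
    let _output = layer.forward(input)?;
    let _input_errors = layer.backward(errors, 0.01)?;
    layer.remove_tracer();                              // strip the bookkeeping before evolving
    Some(())
}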


Trait Implementations

impl PartialEq<dyn Layer + 'static> for dyn Layer

Need to be able to compare dyn layers (is there a better way to do this?)
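One common way to implement equality for trait objects is to downcast both sides through Any and compare as the concrete type; a sketch of that idea is below. It assumes the concrete layer types implement PartialEq and only handles Dense; a real implementation would have to check each layer type, and radiate's actual approach may differ:

impl PartialEq for dyn Layer {
    fn eq(&self, other: &dyn Layer) -> bool {
        match (
            self.as_ref_any().downcast_ref::<Dense>(),
            other.as_ref_any().downcast_ref::<Dense>(),
        ) {
            (Some(a), Some(b)) => a == b,   // requires Dense: PartialEq
            _ => false,                     // different (or unknown) concrete types
        }
    }
}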

Implementors

impl Layer for Dense

fn forward(&mut self, data: &Vec<f32>) -> Option<Vec<f32>>

Feed a vec of inputs through the network; will panic! if the shapes of the values do not match or if something goes wrong within the feed-forward process.

fn backward(&mut self, error: &Vec<f32>, learning_rate: f32) -> Option<Vec<f32>>

Backpropagation algorithm: transfer the error through the network and change the weights of the edges accordingly. This is fairly straightforward thanks to the design of the NEAT graph.

fn add_tracer(&mut self)

Add a tracer to the layer to keep track of historical meta data.

impl Layer for LSTM

fn backward(
    &mut self,
    errors: &Vec<f32>,
    learning_rate: f32
) -> Option<Vec<f32>>

Apply backpropagation through time.

fn reset(&mut self)

Reset the LSTM network by clearing the tracer and the states, as well as the memory and hidden state.
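An illustrative sequence loop (the function and data layout are hypothetical): reset is called between independent sequences so state from one does not leak into the next:

// Each sequence is a list of per-timestep input vecs.
fn run_sequences(lstm: &mut LSTM, sequences: &Vec<Vec<Vec<f32>>>) {
    for sequence in sequences.iter() {
        lstm.reset();                    // clear memory and hidden state
        for step in sequence.iter() {
            let _output = lstm.forward(step);
        }
    }
}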

fn add_tracer(&mut self)

Add tracers to all the gates in the layer.

fn remove_tracer(&mut self)

Remove the tracers from all the gates in the layer.
