Trait radiate::models::neat::layers::layer::Layer[src]

pub trait Layer: LayerClone + Any + Debug + Send + Sync + Serialize + Deserialize {
    fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>;
    fn backward(
        &mut self,
        errors: &Vec<f32>,
        learning_rate: f32
    ) -> Option<Vec<f32>>;
    fn as_ref_any(&self) -> &dyn Any;
    fn as_mut_any(&mut self) -> &mut dyn Any;
    fn shape(&self) -> (usize, usize);

    fn reset(&mut self) { ... }
    fn add_tracer(&mut self) { ... }
    fn remove_tracer(&mut self) { ... }
}

Layer is a layer in the neural network. In order for the network to be evolved, each layer must be cloneable, which is where LayerClone comes in: it allows a Box<dyn Layer> to be cloned without knowing which type it really is under the hood. The Any bound allows the underlying object to be downcast to a concrete type.
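As a rough illustration of the LayerClone half, the object-safe clone pattern usually looks like the sketch below. The exact definition lives in the radiate source; the clone_box name and the blanket impl are assumptions made purely for illustration.

// Sketch only: the real LayerClone trait in radiate may differ.
trait LayerClone {
    fn clone_box(&self) -> Box<dyn Layer>;
}

// Blanket impl: any Layer that is also Clone gets LayerClone for free.
impl<T> LayerClone for T
where
    T: 'static + Layer + Clone,
{
    fn clone_box(&self) -> Box<dyn Layer> {
        Box::new(self.clone())
    }
}

// This is what lets a Box<dyn Layer> be cloned without knowing the concrete type.
impl Clone for Box<dyn Layer> {
    fn clone(&self) -> Box<dyn Layer> {
        self.clone_box()
    }
}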

Required methods

fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>

Propagate an input vec through this layer. This is done differently depending on the type of layer, just as backpropagation is. If the layer is only being evolved, it should not keep track of the meta data within it, because the network never needs to backprop afterwards. Returns the output as a vec.
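A minimal sketch of calling forward through the trait object, assuming layer was obtained elsewhere (for example a Dense layer pulled out of a network):

fn feed(layer: &mut dyn Layer, inputs: &Vec<f32>) -> Vec<f32> {
    // forward returns None if something goes wrong (e.g. a shape mismatch)
    layer.forward(inputs).expect("forward pass failed")
}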

fn backward(
    &mut self,
    errors: &Vec<f32>,
    learning_rate: f32
) -> Option<Vec<f32>>

Take the errors of the feed forward pass and backpropagate them through the layer to adjust the weights of the connections between the neurons. Returns the error of this layer's input neurons, which is needed to transfer the error from layer to layer.
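A hedged sketch of one supervised training step tying forward and backward together; the simple (target - output) error used here is only illustrative and is not necessarily how radiate's own trainers compute it:

fn train_step(
    layer: &mut dyn Layer,
    input: &Vec<f32>,
    target: &Vec<f32>,
    learning_rate: f32,
) -> Option<Vec<f32>> {
    let output = layer.forward(input)?;
    // error for this layer's output neurons
    let errors: Vec<f32> = target
        .iter()
        .zip(output.iter())
        .map(|(t, o)| t - o)
        .collect();
    // returns the error to hand to the previous layer
    layer.backward(&errors, learning_rate)
}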

fn as_ref_any(&self) -> &dyn Any

Get a reference to the underlying type without generics in order to downcast to a concrete type

fn as_mut_any(&mut self) -> &mut dyn Any

Get a mutable reference to the underlying type without generics in order to downcast to a concrete type
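A sketch of downcasting through as_ref_any; the Dense module path is assumed here, and downcast_ref simply returns None if the layer is not actually a Dense:

// Assumes something like `use radiate::models::neat::layers::{layer::Layer, dense::Dense};`
// (the exact module paths may differ).
fn inspect(layer: &dyn Layer) {
    // as_ref_any yields a &dyn Any, which can then be downcast
    if let Some(dense) = layer.as_ref_any().downcast_ref::<Dense>() {
        println!("dense layer shape: {:?}", dense.shape());
    }
}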

fn shape(&self) -> (usize, usize)

Return the (input size, output size) of this layer. This is used to make specifying layer sizes easier: the user only needs to say the size of the output, not the input, which would be redundant.
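For example, shape makes it easy to sanity-check a stack of layers while propagating through them; this sketch is built only on the trait methods documented here:

fn forward_all(layers: &mut Vec<Box<dyn Layer>>, input: Vec<f32>) -> Option<Vec<f32>> {
    let mut current = input;
    for layer in layers.iter_mut() {
        let (num_in, _num_out) = layer.shape();
        // each layer's input size should match the previous layer's output size
        debug_assert_eq!(current.len(), num_in);
        current = layer.forward(&current)?;
    }
    Some(current)
}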


Provided methods

fn reset(&mut self)

Reset the layer. Implementing this is not required.

fn add_tracer(&mut self)

Add a tracer to the layer so it can keep track of historical meta data.

fn remove_tracer(&mut self)

Remove the tracer from a layer so that it can be evolved without keeping track of that data.
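Putting the provided methods together, a plausible pattern is to strip the tracer before evolution and add it back before traditional training; this is a sketch, not code from the crate:

fn prepare_for_evolution(layer: &mut dyn Layer) {
    // evolution never backpropagates, so historical meta data is dead weight
    layer.remove_tracer();
    layer.reset();
}

fn prepare_for_training(layer: &mut dyn Layer) {
    // backpropagation needs the meta data the tracer records
    layer.add_tracer();
}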


Trait Implementations

impl PartialEq<dyn Layer + 'static> for dyn Layer[src]

Need to be able to compare dyn Layers (is there a better way to do this?)

impl<'typetag> Serialize for dyn Layer + 'typetag[src]

impl<'typetag> Serialize for dyn Layer + Send + 'typetag[src]

impl<'typetag> Serialize for dyn Layer + Sync + 'typetag[src]

impl<'typetag> Serialize for dyn Layer + Send + Sync + 'typetag[src]

impl Strictest for dyn Layer[src]

type Object = dyn Layer + Send + Sync

Implementors

impl Layer for Dense[src]

fn forward(&mut self, data: &Vec<f32>) -> Option<Vec<f32>>[src]

Feed a vec of inputs through the network. This will panic! if the shapes of the values do not match or if something goes wrong within the feed forward process.

fn backward(&mut self, error: &Vec<f32>, learning_rate: f32) -> Option<Vec<f32>>[src]

The backpropagation algorithm: transfer the error through the network and change the weights of the edges accordingly. This is fairly straightforward due to the design of the NEAT graph.

fn add_tracer(&mut self)[src]

add a tracer to the layer to keep track of historical meta data

impl Layer for GRU[src]

implement the layer trait for the GRU so it can be stored in the neat network

fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>[src]

implement the propagation function for the GRU layer

impl Layer for LSTM[src]

fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>[src]

Forward propagate inputs. If the model is being evolved, don't spawn extra threads, because doing so roughly doubles the runtime. If the model is being trained traditionally, step forward asynchronously by spawning a thread for each individual gate, which is roughly twice as fast as stepping through the gates synchronously.

fn backward(&mut self, errors: &Vec<f32>, learning_rate: f32) -> Option<Vec<f32>>[src]

apply backpropagation through time asynchronously because this is not done during evolution

fn reset(&mut self)[src]

Reset the LSTM network by clearing the tracer and the states as well as the memory and hidden state.

fn add_tracer(&mut self)[src]

add tracers to all the gates in the layer

fn remove_tracer(&mut self)[src]

remove the tracers from all the gates in the layer
