Trait radiate::models::neat::layers::layer::Layer
Layer is a layer in the neural network. In order for
the network to be evolved, it must be able to be cloned, which is where LayerClone
comes in - allowing the Box<dyn Layer> holding the layer to be cloned.
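A common way to make Box<dyn Layer> cloneable is a helper trait with a blanket impl. Below is a minimal sketch of that pattern, assuming this is roughly what LayerClone does; the stub Layer trait is trimmed to one method for illustration and is not the crate's full definition:

```rust
// Trimmed, hypothetical stand-in for the real trait, for illustration only.
pub trait Layer: LayerClone {
    fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>;
}

// The helper trait: anything that is Layer + Clone gets clone_box for free.
pub trait LayerClone {
    fn clone_box(&self) -> Box<dyn Layer>;
}

impl<T: 'static + Layer + Clone> LayerClone for T {
    fn clone_box(&self) -> Box<dyn Layer> {
        Box::new(self.clone())
    }
}

// Now Box<dyn Layer> is itself cloneable, which is what evolution needs.
impl Clone for Box<dyn Layer> {
    fn clone(&self) -> Box<dyn Layer> {
        self.clone_box()
    }
}
```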
Required methods
fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>
Propagate an input vec through this layer. This is done differently depending on the type of layer, just as backpropagation is. If the layer is only being evolved, it should not keep track of the metadata within, because there is no need for the network to backprop afterward. Return the output as a vec.
fn backward(
&mut self,
errors: &Vec<f32>,
learning_rate: f32
) -> Option<Vec<f32>>
Take the errors of the feed forward and backpropagate them through the network to adjust the weights of the connections between the neurons. Return the error of this layer's input neurons - needed to transfer the error from layer to layer.
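Together, forward and backward are enough to drive a whole stack of layers. A hedged sketch of one training step using only the documented signatures; train_step, the squared-error gradient, and the slice of boxed layers are illustrative assumptions, not crate API:

```rust
fn train_step(
    layers: &mut [Box<dyn Layer>],
    inputs: &Vec<f32>,
    targets: &Vec<f32>,
    learning_rate: f32,
) -> Option<()> {
    // Forward pass: each layer's output becomes the next layer's input.
    let mut activation = inputs.clone();
    for layer in layers.iter_mut() {
        activation = layer.forward(&activation)?;
    }

    // Simple squared-error gradient at the output (an assumption; the
    // crate may compute its loss elsewhere).
    let mut errors: Vec<f32> = activation
        .iter()
        .zip(targets.iter())
        .map(|(o, t)| t - o)
        .collect();

    // Backward pass: each layer returns the error of its input neurons,
    // which is exactly what the previous layer needs.
    for layer in layers.iter_mut().rev() {
        errors = layer.backward(&errors, learning_rate)?;
    }
    Some(())
}
```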
fn as_ref_any(&self) -> &dyn Any
Get a reference to the underlying type without generics in order to downcast to a concrete type
fn as_mut_any(&mut self) -> &mut dyn Any
Get a mutable reference to the underlying type without generics in order to downcast to a concrete type
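A hedged sketch of the intended use: downcast the trait object back to its concrete type. The layer variable and the assumption that it holds a Dense are illustrative:

```rust
// `layer` is assumed to be a &dyn Layer that actually holds a Dense.
if let Some(dense) = layer.as_ref_any().downcast_ref::<Dense>() {
    // Work with the concrete Dense type directly.
    println!("dense layer shape: {:?}", dense.shape());
}
```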
fn shape(&self) -> (usize, usize)
Return the (input size, output size) of this layer - used to make specifying layer sizes easier, so the user only needs to say the size of the output, not the input, which would be redundant.
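Because shape exposes both sizes, a caller can, for example, verify that consecutive layers line up. A small sketch; shapes_align is an assumed helper, not crate API:

```rust
// Each layer's output size must equal the next layer's input size.
fn shapes_align(layers: &[Box<dyn Layer>]) -> bool {
    layers
        .windows(2)
        .all(|pair| pair[0].shape().1 == pair[1].shape().0)
}
```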
Provided methods
fn reset(&mut self)
Reset the layer - not a necessary implementation.
fn add_tracer(&mut self)
Add a tracer to the layer to keep track of historical metadata.
fn remove_tracer(&mut self)
Remove the tracer from a layer so that it can be evolved without keeping track of data.
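A hedged sketch of how these provided methods fit the two workflows described above: tracers on for traditional backprop training, off (with the layer reset) for evolution. The variable names and the learning rate are illustrative:

```rust
// `layer` is assumed to be any value implementing Layer.
// Traditional training: tracer on so backward() has the history it needs.
layer.add_tracer();
let output = layer.forward(&inputs).expect("forward failed");
let input_errors = layer.backward(&errors, 0.01).expect("backward failed");

// Evolution: drop the bookkeeping and clear any leftover state.
layer.remove_tracer();
layer.reset();
```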
Trait Implementations
impl PartialEq<dyn Layer + 'static> for dyn Layer
[src]
Need to be able to compare dyn Layers (is there a better way to do this?)
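One way such an impl can work - not necessarily what the crate does, as the note above suggests the author also found it awkward - is to downcast both sides and compare concretely. A sketch assuming Dense implements PartialEq:

```rust
impl PartialEq<dyn Layer> for dyn Layer {
    fn eq(&self, other: &dyn Layer) -> bool {
        // Compare as Dense when both sides are Dense; other concrete
        // layer types would get the same treatment. Illustrative only.
        match (
            self.as_ref_any().downcast_ref::<Dense>(),
            other.as_ref_any().downcast_ref::<Dense>(),
        ) {
            (Some(a), Some(b)) => a == b,
            _ => false,
        }
    }
}
```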
impl<'typetag> Serialize for dyn Layer + 'typetag
[src]
impl<'typetag> Serialize for dyn Layer + Send + 'typetag
[src]
impl<'typetag> Serialize for dyn Layer + Sync + 'typetag
[src]
impl<'typetag> Serialize for dyn Layer + Send + Sync + 'typetag
[src]
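These impls come from typetag-style registration, which is what lets a boxed trait object pass through serde. A hedged usage sketch; serde_json is an assumed format choice, not something this page specifies:

```rust
// Serialize a boxed layer like any other serde value; the typetag
// machinery records which concrete type is inside the box.
fn save_layer(layer: &Box<dyn Layer>) -> serde_json::Result<String> {
    serde_json::to_string(layer)
}
```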
impl Strictest for dyn Layer
[src]
Implementors
impl Layer for Dense
[src]
fn forward(&mut self, data: &Vec<f32>) -> Option<Vec<f32>>
[src]
Feed a vec of inputs through the network; will panic! if the shapes of the values do not match or if something goes wrong within the feed forward process.
fn backward(&mut self, error: &Vec<f32>, learning_rate: f32) -> Option<Vec<f32>>
[src]
Backpropagation algorithm: transfer the error through the network and change the weights of the edges accordingly. This is pretty straightforward due to the design of the NEAT graph.
fn reset(&mut self)
[src]
fn add_tracer(&mut self)
[src]
Add a tracer to the layer to keep track of historical metadata.
fn remove_tracer(&mut self)
[src]
fn as_ref_any(&self) -> &dyn Any where
Self: Sized + 'static,
[src]
fn as_mut_any(&mut self) -> &mut dyn Any where
Self: Sized + 'static,
[src]
fn shape(&self) -> (usize, usize)
[src]
impl Layer for GRU
[src]
Implement the Layer trait for the GRU so it can be stored in the NEAT network.
fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>
[src]
Implement the propagation function for the GRU layer.
fn backward(
&mut self,
errors: &Vec<f32>,
learning_rate: f32
) -> Option<Vec<f32>>
[src]
fn as_ref_any(&self) -> &dyn Any where
Self: Sized + 'static,
[src]
fn as_mut_any(&mut self) -> &mut dyn Any where
Self: Sized + 'static,
[src]
fn shape(&self) -> (usize, usize)
[src]
impl Layer for LSTM
[src]
fn forward(&mut self, inputs: &Vec<f32>) -> Option<Vec<f32>>
[src]
Forward propagate inputs. If the model is being evolved, don't spawn extra threads, because that slows the process down to about double the original time. If the model is being trained traditionally, step forward asynchronously by spawning a thread for each individual gate, which gives speeds roughly double those of a synchronous pass.
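A hedged sketch of the per-gate threading idea described above, assuming each gate sits behind an Arc<RwLock<..>>; the Gate stub and its feed method are stand-ins, not the crate's real types:

```rust
use std::sync::{Arc, RwLock};
use std::thread;

// Stand-in gate type for illustration.
struct Gate;
impl Gate {
    fn feed(&mut self, inputs: &[f32]) -> Vec<f32> {
        inputs.to_vec()
    }
}

// Spawn one thread per gate, then join them all before combining the
// results - the asynchronous path used for traditional training.
fn step_gates(gates: &[Arc<RwLock<Gate>>], inputs: &Arc<Vec<f32>>) -> Vec<Vec<f32>> {
    let handles: Vec<_> = gates
        .iter()
        .map(|gate| {
            let gate = Arc::clone(gate);
            let inputs = Arc::clone(inputs);
            thread::spawn(move || gate.write().unwrap().feed(&inputs))
        })
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}
```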
fn backward(
&mut self,
errors: &Vec<f32>,
learning_rate: f32
) -> Option<Vec<f32>>
[src]
Apply backpropagation through time asynchronously, because this is not done during evolution.
fn reset(&mut self)
[src]
Reset the LSTM network by clearing the tracer and the states as well as the memory and hidden state.
fn add_tracer(&mut self)
[src]
Add tracers to all the gates in the layer.
fn remove_tracer(&mut self)
[src]
Remove the tracers from all the gates in the layer.
fn as_ref_any(&self) -> &dyn Any where
Self: Sized + 'static,
[src]
fn as_mut_any(&mut self) -> &mut dyn Any where
Self: Sized + 'static,
[src]