Struct radiate::models::neat::layers::dense::Dense

pub struct Dense {
    pub inputs: Vec<Uuid>,
    pub outputs: Vec<Uuid>,
    pub nodes: HashMap<Uuid, *mut Neuron>,
    pub edges: HashMap<Uuid, Edge>,
    pub trace_states: Option<Tracer>,
    pub layer_type: LayerType,
    pub activation: Activation,
}


impl Dense[src]

pub fn new(
    num_in: u32,
    num_out: u32,
    layer_type: LayerType,
    activation: Activation
) -> Self

Create a new fully connected dense layer. Each input is connected to each output, with a randomly generated weight attached to the connection.

pub fn get_outputs(&self) -> Option<Vec<f32>>[src]

Get the outputs of the layer as a `Vec`.

pub fn add_node(&mut self, activation: Activation)[src]

Add a node to the network by picking a random edge and inserting the new node in between that edge's source and destination nodes. The old weight is pushed forward onto the new outgoing connection, while a randomly chosen new weight is placed between the old source node and the new node.
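The weight bookkeeping above can be illustrated with a minimal, self-contained sketch; `SimpleEdge` and `split_edge` are hypothetical stand-ins, since the real layer keys nodes by `Uuid` and stores raw `Neuron` pointers.

```rust
// Hypothetical sketch of the NEAT add-node mutation: split one edge into
// two by inserting a new node between its endpoints.
#[derive(Debug, Clone)]
struct SimpleEdge {
    src: usize,
    dst: usize,
    weight: f32,
}

/// Split `edges[idx]` by inserting `new_node` between its endpoints.
/// The old weight is pushed forward onto the new outgoing edge, while the
/// incoming edge receives the (here fixed, normally random) new weight.
fn split_edge(edges: &mut Vec<SimpleEdge>, idx: usize, new_node: usize, new_weight: f32) {
    let old = edges.remove(idx);
    // old source -> new node: carries the newly chosen weight
    edges.push(SimpleEdge { src: old.src, dst: new_node, weight: new_weight });
    // new node -> old destination: carries the old weight forward
    edges.push(SimpleEdge { src: new_node, dst: old.dst, weight: old.weight });
}

fn main() {
    let mut edges = vec![SimpleEdge { src: 0, dst: 1, weight: 0.8 }];
    split_edge(&mut edges, 0, 2, 0.3);
    assert_eq!(edges.len(), 2);
    assert_eq!(edges[1].weight, 0.8); // old weight pushed forward
}
```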

pub fn add_edge(&mut self)[src]

Add a connection to the network. Randomly pick a sending node that is not an output and a receiving node that is not an input, then validate that the desired connection can be made. If it can, make the connection with a weight of 0.5 in order to minimally impact the network.
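The validity rule can be sketched in isolation; `NodeKind` and `can_connect` are hypothetical names, not part of the radiate API.

```rust
// Hypothetical sketch of the connection-validity rule: the sender may not
// be an output node and the receiver may not be an input node.
#[derive(PartialEq)]
enum NodeKind {
    Input,
    Hidden,
    Output,
}

fn can_connect(from: &NodeKind, to: &NodeKind) -> bool {
    *from != NodeKind::Output && *to != NodeKind::Input
}

fn main() {
    assert!(can_connect(&NodeKind::Input, &NodeKind::Hidden));
    assert!(!can_connect(&NodeKind::Output, &NodeKind::Hidden));
    assert!(!can_connect(&NodeKind::Hidden, &NodeKind::Input));
    // a valid new connection starts with a neutral weight of 0.5
    let new_weight = 0.5_f32;
    assert_eq!(new_weight, 0.5);
}
```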

pub fn get_output_states(&self) -> Vec<f32>[src]

Get the states of the output neurons. This allows softmax and other operations that require knowledge of more than just the immediate neuron's state.

pub fn set_output_values(&mut self)[src]

Because the output neurons might need to be seen together, this must be called to set their values before finishing the feed-forward pass.

pub fn update_traces(&mut self)[src]

Take a snapshot of each neuron's value at this time step if tracing is enabled.

Trait Implementations

impl Clone for Dense[src]

Implement `Clone` for the NEAT neural network in order to facilitate proper crossover and mutation for the network.

impl Debug for Dense[src]

impl<'de> Deserialize<'de> for Dense[src]

Implement `Deserialize` for the dense layer manually. Because the layer uses raw pointers, the implementation cannot be derived; this is the only one that needs to be written by hand — everything else is derived.

impl Display for Dense[src]

Simple override of `Display` to make debugging output a little cleaner.

impl Drop for Dense[src]

Because the layer is made of raw mutable pointers, if those pointers are not dropped there is a severe memory leak — possibly gigabytes of RAM over only a few generations, depending on the population size. This `Drop` implementation frees every node in the layer.
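The pattern behind this `Drop` impl can be shown with a minimal, self-contained sketch: pointers produced by `Box::into_raw` must be reclaimed with `Box::from_raw` or they leak. `Graph` and the drop counter are hypothetical, added here only to make the reclamation observable.

```rust
use std::collections::HashMap;
use std::sync::atomic::{AtomicUsize, Ordering};

// Counts how many nodes have been freed, so the leak fix is observable.
static DROPS: AtomicUsize = AtomicUsize::new(0);

struct Node {
    _value: f32,
}

impl Drop for Node {
    fn drop(&mut self) {
        DROPS.fetch_add(1, Ordering::SeqCst);
    }
}

struct Graph {
    nodes: HashMap<u32, *mut Node>,
}

impl Drop for Graph {
    fn drop(&mut self) {
        for (_, ptr) in self.nodes.drain() {
            // SAFETY: each pointer was produced by Box::into_raw and is
            // owned exclusively by this graph, so reclaiming it is sound.
            unsafe { drop(Box::from_raw(ptr)) };
        }
    }
}

fn main() {
    let mut nodes = HashMap::new();
    for i in 0..3 {
        nodes.insert(i, Box::into_raw(Box::new(Node { _value: i as f32 })));
    }
    drop(Graph { nodes }); // without the Drop impl, all three nodes would leak
    assert_eq!(DROPS.load(Ordering::SeqCst), 3);
}
```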

impl Genome<Dense, NeatEnvironment> for Dense where
    Dense: Layer

impl Layer for Dense[src]

fn forward(&mut self, data: &Vec<f32>) -> Option<Vec<f32>>[src]

Feed a `Vec` of inputs through the network; will `panic!` if the shapes of the values do not match or if something goes wrong during the feed-forward pass.
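As a rough sketch of what a dense feed-forward step computes — out_j = activation(Σᵢ inᵢ·w[j][i] + b_j) — the following is a hypothetical, matrix-based stand-in (the real `forward` walks the `Uuid`-keyed neuron graph, and this sketch returns `None` on a shape mismatch instead of panicking):

```rust
// Hypothetical dense feed-forward sketch with a ReLU activation.
fn forward(inputs: &[f32], weights: &[Vec<f32>], biases: &[f32]) -> Option<Vec<f32>> {
    // mirror the shape check that the real layer enforces
    if weights.len() != biases.len() || weights.iter().any(|row| row.len() != inputs.len()) {
        return None;
    }
    Some(
        weights
            .iter()
            .zip(biases)
            .map(|(row, b)| {
                let sum: f32 = row.iter().zip(inputs).map(|(w, x)| w * x).sum();
                (sum + b).max(0.0) // ReLU activation
            })
            .collect(),
    )
}

fn main() {
    // 0.5*1.0 + (-1.0)*2.0 + 0.25 = -1.25, clamped to 0.0 by ReLU
    let out = forward(&[1.0, 2.0], &[vec![0.5, -1.0]], &[0.25]).unwrap();
    assert_eq!(out, vec![0.0]);
}
```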

fn backward(&mut self, error: &Vec<f32>, learning_rate: f32) -> Option<Vec<f32>>[src]

Backpropagation algorithm: transfer the error through the network and change the weights of the edges accordingly. This is fairly straightforward due to the design of the NEAT graph.
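For a single connection, the transfer-and-update step can be sketched as follows; `update_edge` is a hypothetical helper, not a radiate function:

```rust
// Hypothetical per-edge backpropagation step: the error passed upstream is
// scaled by the pre-update weight, then the weight takes a gradient step.
fn update_edge(weight: &mut f32, error: f32, input: f32, learning_rate: f32) -> f32 {
    let upstream = error * *weight; // error transferred to the source node
    *weight += learning_rate * error * input; // gradient step on this edge
    upstream
}

fn main() {
    let mut w = 0.5_f32;
    let upstream = update_edge(&mut w, 0.2, 1.0, 0.1);
    assert!((w - 0.52).abs() < 1e-6); // 0.5 + 0.1 * 0.2 * 1.0
    assert!((upstream - 0.1).abs() < 1e-6); // 0.2 * 0.5
}
```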

fn add_tracer(&mut self)[src]

Add a tracer to the layer to keep track of historical metadata.

impl PartialEq<Dense> for Dense[src]

Implement `PartialEq` because, if the network itself is to be used as a problem, one instance must be comparable to another.

impl Send for Dense[src]

These must be implemented for the network, or any type, to be used across separate threads. Because implementing `Send` and `Sync` by hand is dangerous and unsafe, these marker implementations simply silence the compiler error; realistically they need no custom logic for the program to work.

impl Serialize for Dense[src]

Manually implement `Serialize` for the dense layer. Because it uses raw pointers, the implementation cannot be derived — there is no way to serialize and deserialize raw pointers automatically.

impl Sync for Dense[src]

Auto Trait Implementations

impl RefUnwindSafe for Dense

impl Unpin for Dense

impl UnwindSafe for Dense

Blanket Implementations

impl<T> Any for T where
    T: 'static + ?Sized

impl<T> Any for T where
    T: Any + Serialize + Deserialize

impl<T> Borrow<T> for T where
    T: ?Sized

impl<T> BorrowMut<T> for T where
    T: ?Sized

impl<T> Debug for T where
    T: Debug + Serialize + Deserialize + ?Sized

impl<T> Deserialize for T where
    T: DeserializeOwned

impl<T> DeserializeOwned for T where
    T: for<'de> Deserialize<'de>, 

impl<T> Display for T where
    T: Display + Serialize + Deserialize + ?Sized

impl<T> From<T> for T[src]

impl<T, U> Into<U> for T where
    U: From<T>, 

impl<L> LayerClone for L where
    L: 'static + Layer + Clone

impl<T> Same<T> for T

type Output = T

Should always be Self

impl<T> Serialize for T where
    T: Serialize + ?Sized

impl<T> Serialize for T where
    T: Serialize + ?Sized

impl<T> ToOwned for T where
    T: Clone

type Owned = T

The resulting type after obtaining ownership.

impl<T> ToString for T where
    T: Display + ?Sized

impl<T, U> TryFrom<U> for T where
    U: Into<T>, 

type Error = Infallible

The type returned in the event of a conversion error.

impl<T, U> TryInto<U> for T where
    U: TryFrom<T>, 

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

impl<T> Type for T[src]

type Meta = Concrete

Type of metadata for type.

impl<T> Type for T where
    T: ?Sized

impl<V, T> VZip<V> for T where
    V: MultiLane<T>,