Struct NeuralNetwork

pub struct NeuralNetwork { /* private fields */ }

The NeuralNetwork type, from which all learning functions come. It stores the network's data and provides functions to access, run, and mutate the network.

use smarty_pants::neural_network::NeuralNetwork;

let mut brain:NeuralNetwork = NeuralNetwork::new(1.0,10,10,3);

brain.set_wheight(10.0,(5,7));
assert!(brain.get_wheight((5,7)).unwrap() == 10.0);

let output:Vec<f64> = brain.run(&vec![1.0,2.0,3.0,4.0,5.0]);

Implementations§

impl NeuralNetwork

pub fn new(default_wheight: f64, hidden_layers: usize, hidden_neurons_per_layer: usize, outputs: usize) -> NeuralNetwork

Creates a new NeuralNetwork using the specified arguments.

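For orientation, a minimal sketch (values chosen arbitrarily) showing which positional argument maps to which parameter:

use smarty_pants::neural_network::NeuralNetwork;

// default_wheight = 1.0, hidden_layers = 2, hidden_neurons_per_layer = 4, outputs = 3
let mut brain: NeuralNetwork = NeuralNetwork::new(1.0, 2, 4, 3);
// The network can then be run on a slice of inputs; see run() below.
let _output: Vec<f64> = brain.run(&vec![0.5]);
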
Examples found in repository
examples/load_save.rs (line 12)
10 fn main() {
11     // Create a `NeuralNetwork`.
12     let mut network:NeuralNetwork = NeuralNetwork::new(1.0,1,3,1);
13     // Run the network and save the output.
14     let output:f64 = network.run(&vec![1.0])[0];
15     // Save it to file.
16     write_file("example.brain", &network).unwrap();
17     // Load it.
18     let mut loaded_network:NeuralNetwork = read_file("example.brain").unwrap();
19     // Call `run()` on the newly loaded `NeuralNetwork`.
20     let output2:f64 = loaded_network.run(&vec![1.0])[0];
21     // Show that the `NeuralNetwork` functions the same as it did before.
22     assert!(output == output2);
23 }
More examples
examples/simple_use_case.rs (line 38)
21 fn main() {
22     /// This is our target; we will consider `NeuralNetworks` that get closer to this to be learning.
23     const TARGET:f64 = 10.0;
24
25     /// The margin of error for the networks.
26     const MARGIN:f64 = 0.1;
27
28     /// This is the `INPUT` for the networks, the goal being for them to "learn" how to take this number as
29     /// an input and output `TARGET`. When they output a value within `MARGIN` of `TARGET` that `NeuralNetwork`
30     /// will be considered to have reached the goal.
31     const INPUT:f64 = 1.0;
32
33     /// The maximum number of `GENERATIONS` that will be run; fewer may be run if a network gets within `MARGIN` of
34     /// the target before `GENERATIONS` generations have been run.
35     const GENERATIONS:usize = 10_000;
36
37     // Create a `Vector` containing all `NeuralNetwork`s in the current generation using `batch_mutate()` and `NeuralNetwork::new()`.
38     let mut networks:Vec<NeuralNetwork> = batch_mutate(5,0.25,&mut NeuralNetwork::new(1.0,1,3,1), true);
39
40     // This stores the closest value found by the networks; it defaults to negative infinity.
41     let mut closest_value:f64 = f64::NEG_INFINITY;
42
43     // Get the current instant so that it can later be used to time how long it took to finish/fail.
44     let time:Instant = Instant::now();
45
46     // For `generation` in `GENERATIONS` perform `batch_run()` and check if the networks are getting closer.
47     for generation in 0..GENERATIONS {
48         // Run the networks using `INPUT` as the input and store the results in `outputs`.
49         let outputs:Vec<Vec<f64>> = batch_run(&mut networks, &vec![INPUT]);
50
51         // The `closest_network`, used for creating the next generation.
52         let mut closest_network:usize = 0;
53
54         // Loop through every value in `outputs`, checking whether any of the outputs are within `MARGIN` of `TARGET`,
55         // and use a range so that we can track the index of the output easily.
56         for output in 0..outputs.len() {
57             // Since the networks are only outputting a single value we can simply grab the first value of the `Vector`
58             // and check if that's within `MARGIN` of `TARGET` using a range.
59             if (TARGET-MARGIN..=TARGET+MARGIN).contains(&outputs[output][0]) {
60                 // If so, print the value found by the network, the network itself, and the current generation, then exit the program.
61                 println!("Finished in {:?}!\nGeneration: {:?}\nValue: {:?}\nNetwork: {{\nHiddenLayers: {:?}\nOutputLayer: {:?}\n}}", time.elapsed(),generation, outputs[output][0], networks[output].get_weights(), networks[output].get_output_weights());
62                 // Exit code 0 on Linux means no problem, on Windows however this should be 256 but that is outside the scope of this example.
63                 std::process::exit(0);
64             } else {
65                 // If the `output` was not within `MARGIN` of `TARGET`, check if this value is closer to `TARGET` than the last `closest_value`,
66                 // and set `closest_value` to `output` if it is closer.
67                 if outputs[output][0] < TARGET && outputs[output][0] > closest_value {
68                     closest_value = outputs[output][0];
69                     closest_network = output;
70                 } else if outputs[output][0] > TARGET && outputs[output][0] < closest_value {
71                     closest_value = outputs[output][0];
72                     closest_network = output;
73                 }
74             }
75         }
76
77         // Set all `networks` to various mutations of the `closest_network`.
78         networks = batch_mutate(5, 0.25, &networks[closest_network], true);
79     }
80
81     // If we managed to get through `GENERATIONS` generations without getting within `MARGIN` of `TARGET`, output the `closest_value` we found.
82     println!("Failed to get within `MARGIN` within {:?} number of generations, this is the `closest_value` obtained: {:?}. In {:?}", GENERATIONS, closest_value, time.elapsed());
83 }

pub fn new_from(hidden_layers: Vec<Vec<f64>>, output_weights: Vec<f64>) -> NeuralNetwork

Creates a new NeuralNetwork using the provided hidden_layers and output_weights as its weights.

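A minimal sketch of building a network from explicit weights, assuming each inner Vec of hidden_layers is one hidden layer and output_weights holds one weight per output neuron:

use smarty_pants::neural_network::NeuralNetwork;

// Two hidden layers of three neurons each, plus two output weights (assumed layout).
let hidden_layers: Vec<Vec<f64>> = vec![vec![0.5, 0.5, 0.5], vec![1.0, 1.0, 1.0]];
let output_weights: Vec<f64> = vec![0.25, 0.75];
let mut brain: NeuralNetwork = NeuralNetwork::new_from(hidden_layers, output_weights);
let _output: Vec<f64> = brain.run(&vec![1.0]);
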

pub fn run(&mut self, inputs: &[f64]) -> Vec<f64>

Runs the NeuralNetwork on the provided inputs and returns the output.

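A short sketch of a call; run takes &mut self and borrows the inputs as a slice:

use smarty_pants::neural_network::NeuralNetwork;

let mut brain: NeuralNetwork = NeuralNetwork::new(1.0, 2, 4, 3);
// A slice literal works directly; a Vec also coerces through `&vec![...]`.
let output: Vec<f64> = brain.run(&[0.5, 1.5]);
// Presumably one value per output neuron (three here, matching the `outputs` argument).
println!("{:?}", output);
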
Examples found in repository
examples/load_save.rs (line 14): see the full listing under NeuralNetwork::new() above.

pub fn set_wheight(&mut self, wheight: f64, neuron: (usize, usize)) -> Option<String>

Sets the weight of a single neuron in the hidden layers. If the specified neuron is out of bounds, an error is returned in the form of an Option<String>. The returned text can either be printed or ignored and simply checked for existence.

use smarty_pants::neural_network::NeuralNetwork;

let mut brain:NeuralNetwork = NeuralNetwork::new(1.0,10,10,3);

match brain.set_wheight(64f64, (16usize, 23usize)) {
    None => println!("No error"),
    Some(e) => println!("Error: {}", e)
}

pub fn get_wheight(&self, neuron: (usize, usize)) -> Result<f64, Error>

Gets the weight of a single neuron in the hidden_layers. Returns an error if the specified neuron is outside the bounds of the hidden_layers.

use smarty_pants::neural_network::NeuralNetwork;

let mut brain:NeuralNetwork = NeuralNetwork::new(1.0,10,10,3);

match brain.get_wheight((16usize,23usize)) {
    Ok(_) => println!("No error"),
    Err(_) => println!("Error")
}

pub fn mutate(&mut self, mutation_rate: f64, outputs: bool)

Mutates every weight in the NeuralNetwork by a random amount of at most mutation_rate in both the positive and negative directions. It does this through addition and subtraction. If outputs is true then it will also mutate the output_weights.

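A minimal sketch; the exact random distribution is an implementation detail, but after the call the weights will generally have shifted by at most mutation_rate each:

use smarty_pants::neural_network::NeuralNetwork;

let mut brain: NeuralNetwork = NeuralNetwork::new(1.0, 3, 3, 1);
let before: Vec<Vec<f64>> = brain.get_weights();
// Nudge every weight by at most 0.25 in either direction; `true` also mutates the output weights.
brain.mutate(0.25, true);
println!("weights changed: {}", brain.get_weights() != before);
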

pub fn batch_new(amount: usize, default_wheight: f64, hidden_layers: usize, hidden_neurons_per_layer: usize, outputs: usize) -> Vec<NeuralNetwork>

Returns a Vector containing amount neural networks, all with the same starting values. This function works by repeatedly calling NeuralNetwork::new(), so it isn't any more efficient; it's simply here for convenience.

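A small sketch of the convenience wrapper:

use smarty_pants::neural_network::NeuralNetwork;

// Five networks, each created with the same arguments as a single `new()` call.
let networks: Vec<NeuralNetwork> = NeuralNetwork::batch_new(5, 1.0, 2, 4, 1);
assert_eq!(networks.len(), 5);
// They share starting values, so they compare equal via `PartialEq`.
assert!(networks[0] == networks[1]);
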

pub fn get_weights(&self) -> Vec<Vec<f64>>

Returns the hidden_layers weights of the network.

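A small sketch; presumably the outer Vec holds one entry per hidden layer and each inner Vec one weight per neuron:

use smarty_pants::neural_network::NeuralNetwork;

let brain: NeuralNetwork = NeuralNetwork::new(1.0, 2, 4, 1);
let weights: Vec<Vec<f64>> = brain.get_weights();
println!("{:?}", weights);
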
Examples found in repository
examples/simple_use_case.rs (line 61): see the full listing under NeuralNetwork::new() above.

pub fn get_output_weights(&self) -> Vec<f64>

Returns the output_weights of the network.

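A small sketch; presumably one weight per output neuron:

use smarty_pants::neural_network::NeuralNetwork;

let brain: NeuralNetwork = NeuralNetwork::new(1.0, 2, 4, 3);
let output_weights: Vec<f64> = brain.get_output_weights();
println!("{:?}", output_weights);
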
Examples found in repository
examples/simple_use_case.rs (line 61): see the full listing under NeuralNetwork::new() above.

Trait Implementations§

impl Clone for NeuralNetwork

fn clone(&self) -> NeuralNetwork

Returns a copy of the value. Read more

fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source. Read more

impl Debug for NeuralNetwork

fn fmt(&self, f: &mut Formatter<'_>) -> Result

Formats the value using the given formatter. Read more
impl<'de> Deserialize<'de> for NeuralNetwork

fn deserialize<__D>(__deserializer: __D) -> Result<Self, __D::Error>
where __D: Deserializer<'de>,

Deserialize this value from the given Serde deserializer. Read more

impl PartialEq for NeuralNetwork

fn eq(&self, other: &NeuralNetwork) -> bool

Tests for self and other values to be equal, and is used by ==.

fn ne(&self, other: &Rhs) -> bool

Tests for !=. The default implementation is almost always sufficient, and should not be overridden without very good reason.
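
Since NeuralNetwork implements both Clone and PartialEq, a cloned network compares equal to its source; a minimal sketch:

use smarty_pants::neural_network::NeuralNetwork;

let original: NeuralNetwork = NeuralNetwork::new(1.0, 2, 4, 1);
let copy: NeuralNetwork = original.clone();
// `==` comes from the `PartialEq` implementation documented above.
assert!(copy == original);
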
impl Serialize for NeuralNetwork

fn serialize<__S>(&self, __serializer: __S) -> Result<__S::Ok, __S::Error>
where __S: Serializer,

Serialize this value into the given Serde serializer. Read more
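
These Serde implementations work with any Serde data format. A hedged sketch of a round trip, assuming serde_json has been added as a dependency (the load_save.rs example above uses write_file and read_file helpers instead):

use smarty_pants::neural_network::NeuralNetwork;

let network: NeuralNetwork = NeuralNetwork::new(1.0, 2, 4, 1);
// Serialize to a JSON string, then deserialize it back.
let json: String = serde_json::to_string(&network).expect("serialization failed");
let restored: NeuralNetwork = serde_json::from_str(&json).expect("deserialization failed");
// The round trip preserves the network, so PartialEq reports equality.
assert!(restored == network);
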

impl StructuralPartialEq for NeuralNetwork

Auto Trait Implementations§

Blanket Implementations§

impl<T> Any for T
where T: 'static + ?Sized,

fn type_id(&self) -> TypeId

Gets the TypeId of self. Read more

impl<T> Borrow<T> for T
where T: ?Sized,

fn borrow(&self) -> &T

Immutably borrows from an owned value. Read more

impl<T> BorrowMut<T> for T
where T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value. Read more

impl<T> CloneToUninit for T
where T: Clone,

unsafe fn clone_to_uninit(&self, dest: *mut u8)

🔬 This is a nightly-only experimental API. (clone_to_uninit)
Performs copy-assignment from self to dest. Read more

impl<T> From<T> for T

fn from(t: T) -> T

Returns the argument unchanged.

impl<T, U> Into<U> for T
where U: From<T>,

fn into(self) -> U

Calls U::from(self).

That is, this conversion is whatever the implementation of From<T> for U chooses to do.

impl<T> ToOwned for T
where T: Clone,

type Owned = T

The resulting type after obtaining ownership.

fn to_owned(&self) -> T

Creates owned data from borrowed data, usually by cloning. Read more
fn clone_into(&self, target: &mut T)

Uses borrowed data to replace owned data, usually by cloning. Read more

impl<T, U> TryFrom<U> for T
where U: Into<T>,

type Error = Infallible

The type returned in the event of a conversion error.

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.
impl<T, U> TryInto<U> for T
where U: TryFrom<T>,

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.
impl<V, T> VZip<V> for T
where V: MultiLane<T>,

fn vzip(self) -> V

impl<T> DeserializeOwned for T
where T: for<'de> Deserialize<'de>,