Crate neuralneat
neuralneat is an implementation of NeuroEvolution of Augmenting Topologies (NEAT), as described in the June 2002 paper "Evolving Neural Networks through Augmenting Topologies" by Kenneth O. Stanley and Risto Miikkulainen. Much of this implementation was also guided by the NEAT 1.2.1 source code.
Basic usage:
use neuralneat::{Genome, Pool, Trainer};
use neuralneat::evaluation::TrainingData;

// To do something useful, you need to decide what your training data is!
fn load_training_data() -> Vec<TrainingData> {
    vec![]
}

fn main() {
    let input_nodes = 5;
    let output_nodes = 1;

    // Create an initial pool of Genomes
    let mut gene_pool = Pool::with_defaults(input_nodes, output_nodes);

    // Load the data that will be used to train and evolve the Genomes
    let training_data: Vec<TrainingData> = load_training_data();

    // A Trainer can manage the process of training a population of Genomes
    // over successive generations.
    let mut trainer = Trainer::new(training_data);

    // Train for 100 generations
    trainer.train(&mut gene_pool, 100);

    // The winner!
    let best_genome = gene_pool.get_best_genome();
}
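To train on a real problem, load_training_data must return actual samples rather than an empty Vec. The sketch below builds a tiny dataset for a toy task (output 1.0 when the sum of the five inputs exceeds 2.5, otherwise 0.0). It assumes a TrainingData::new(inputs, expected_outputs) constructor; that constructor is an assumption made for illustration, and the crate's actual way of building TrainingData may differ.

use neuralneat::evaluation::TrainingData;

// Each sample pairs 5 inputs (matching input_nodes above) with the 1
// expected output (matching output_nodes above).
// NOTE: TrainingData::new is an assumed constructor, used here only for
// illustration; check the evaluation module for the real API.
fn load_training_data() -> Vec<TrainingData> {
    vec![
        TrainingData::new(vec![0.0, 0.0, 0.0, 0.0, 0.0], vec![0.0]),
        TrainingData::new(vec![1.0, 1.0, 1.0, 0.0, 0.0], vec![1.0]),
        TrainingData::new(vec![0.5, 0.5, 0.5, 0.5, 0.5], vec![0.0]),
        TrainingData::new(vec![1.0, 1.0, 1.0, 1.0, 1.0], vec![1.0]),
    ]
}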
Modules
- The evaluation module contains functions and types for evaluating and activating the neural network of a Genome, including basic activation functions that can be used with the hidden and output layers of a Genome.
Structs
- A Genome contains a network of Neurons and Genes that, when given the same number of inputs as the containing Pool was created with, will produce a Vec with the same number of outputs that the Pool was given.
- Basic statistics about a Genome’s fitness.
- A Trainer will manage the training cycle for a population of Genomes.
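Once training has finished, the best Genome acts as an ordinary feed-forward network, as described above. A minimal sketch of that step follows; get_best_genome appears in the usage example, but evaluate and get_outputs are hypothetical method names assumed here for illustration and may not match the crate's actual Genome API.

use neuralneat::Pool;

// Hypothetical sketch: `evaluate` and `get_outputs` are assumed method
// names (and f32 an assumed value type), used only to show the expected
// shape of inference with the best Genome.
fn classify(gene_pool: &mut Pool, sample: &Vec<f32>) -> Vec<f32> {
    let best_genome = gene_pool.get_best_genome();
    // Feed one sample with the same number of inputs the Pool was given...
    best_genome.evaluate(sample);
    // ...and read back a Vec with the same number of outputs.
    best_genome.get_outputs()
}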