# revonet

Rust implementation of a real-coded genetic algorithm for solving optimization problems and training neural networks. The latter is also known as neuroevolution.
## Features

- real-coded evolutionary algorithm
- neuroevolutionary tuning of the weights of a neural network with a fixed structure
- support for several feed-forward architectures
- automatic computation of statistics for single and multiple runs for EA and NE
- EA settings and results can be saved to JSON
- user-defined objective functions for EA and NE (see examples below)
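The multi-run statistics mentioned above amount to aggregating the best fitness found in each run. A minimal standalone sketch of that kind of aggregation (the names `RunStats` and `summarize` are illustrative, not revonet's API):

```rust
// Hypothetical aggregation of best-fitness values over several runs;
// illustrates the kind of statistics reported, not revonet's actual API.
#[derive(Debug)]
struct RunStats {
    min: f32,
    max: f32,
    mean: f32,
    std: f32,
}

fn summarize(best_fitness_per_run: &[f32]) -> RunStats {
    let n = best_fitness_per_run.len() as f32;
    let mean = best_fitness_per_run.iter().sum::<f32>() / n;
    let var = best_fitness_per_run
        .iter()
        .map(|x| (x - mean).powi(2))
        .sum::<f32>() / n;
    RunStats {
        min: best_fitness_per_run.iter().cloned().fold(f32::INFINITY, f32::min),
        max: best_fitness_per_run.iter().cloned().fold(f32::NEG_INFINITY, f32::max),
        mean,
        std: var.sqrt(),
    }
}

fn main() {
    // Best fitness from three hypothetical runs.
    let stats = summarize(&[1.0, 2.0, 3.0]);
    println!("{:?}", stats);
}
```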
## Examples

### Real-coded genetic algorithm
```rust
let pop_size = 20u32;       // population size.
let problem_dim = 10u32;    // number of optimization parameters.

let problem = RosenbrockProblem{};  // objective function.
let gen_count = 10u32;      // number of generations.
let settings = GASettings::new(pop_size, gen_count, problem_dim);
let mut ga: GA<RosenbrockProblem> = GA::new(settings, problem);   // init GA.
let res = ga.run(settings).expect("Error during GA run");  // run and fetch the results.

// get and print results of the current run.
println!("\n\nGA results: {:?}", res);

// make multiple runs and get combined results.
let res = ga.run_multiple(settings, 10u32).expect("Error during multiple GA runs");
println!("\n\nResults of multiple GA runs: {:?}", res);
```
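For reference, `RosenbrockProblem` evaluates the classic Rosenbrock function, whose global minimum of 0 lies at x = (1, ..., 1). A standalone sketch of the function itself (not revonet's implementation):

```rust
// Classic Rosenbrock function: sum of 100*(x[i+1] - x[i]^2)^2 + (1 - x[i])^2.
// The global minimum is 0, attained at x = (1, ..., 1).
fn rosenbrock(x: &[f32]) -> f32 {
    x.windows(2)
        .map(|w| 100.0 * (w[1] - w[0] * w[0]).powi(2) + (1.0 - w[0]).powi(2))
        .sum()
}

fn main() {
    println!("{}", rosenbrock(&[1.0; 10])); // 0 at the optimum
}
```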
### Run evolution of NN weights to solve a regression problem
```rust
let (pop_size, gen_count, param_count) = (20, 20, 100); // gene_count does not matter here as the NN structure is defined by the problem.
let settings = EASettings::new(pop_size, gen_count, param_count);
let problem = SymbolicRegressionProblem::new_f();

let mut ne: NE<SymbolicRegressionProblem> = NE::new(&problem);
let res = ne.run(settings).expect("Error: NE result is empty");
println!("result: {:?}", res);
println!("\nbest individual: {:?}", res.best);
```
### Creating a multilayered neural network with 2 hidden layers, sigmoid activation, and linear output nodes
```rust
const INPUT_SIZE: usize = 20;
const OUTPUT_SIZE: usize = 2;

let mut rng = rand::thread_rng(); // needed for weights initialization when the NN is built.
let mut net: MultilayeredNetwork = MultilayeredNetwork::new(INPUT_SIZE, OUTPUT_SIZE);
net.add_hidden_layer(30usize, ActivationFunctionType::Sigmoid)
    .add_hidden_layer(20usize, ActivationFunctionType::Sigmoid)
    .build(&mut rng, NeuralArchitecture::Multilayered);  // `build` finishes creation of the neural network.
let (ws, bs) = net.get_weights(); // `ws` and `bs` are `Vec` arrays containing weights and biases for each layer.
assert!(ws.len() == 3); // number of elements equals the number of hidden layers + 1 output layer.
assert!(bs.len() == 3); // same holds for the bias arrays.
```
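To make the per-layer weight and bias layout concrete, here is a minimal standalone sketch of the forward pass such a network performs (illustrative only, not revonet's `compute`; for simplicity it applies sigmoid to every layer, including the output):

```rust
// Minimal dense feed-forward pass with sigmoid activation (standalone sketch).
fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp())
}

// `ws` holds one flat row-major weight matrix per layer;
// `bs` holds one bias vector per layer (its length is that layer's output size).
fn forward(ws: &[Vec<f32>], bs: &[Vec<f32>], input: &[f32]) -> Vec<f32> {
    let mut x = input.to_vec();
    for (w, b) in ws.iter().zip(bs) {
        let in_size = x.len();
        x = (0..b.len())
            .map(|j| {
                let s: f32 = (0..in_size).map(|i| w[j * in_size + i] * x[i]).sum::<f32>() + b[j];
                sigmoid(s)
            })
            .collect();
    }
    x
}

fn main() {
    // One layer, 2 inputs -> 1 output; zero weights and bias give sigmoid(0) = 0.5.
    let out = forward(&[vec![0.0, 0.0]], &[vec![0.0]], &[1.0, 2.0]);
    println!("{:?}", out); // [0.5]
}
```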
### Creating a custom optimization problem for GA
```rust
// Dummy problem returning random fitness.
pub struct DummyProblem;

impl Problem for DummyProblem {
    // Function to evaluate a specific individual.
    fn compute<T: Individual>(&self, ind: &mut T) -> f32 {
        // Use `to_vec` to get a real-coded representation of the individual.
        let _v = ind.to_vec().unwrap();

        let mut rng: StdRng = StdRng::from_seed(&[0]);
        rng.gen::<f32>()
    }
}
```
### Creating a custom problem for NN evolution
```rust
// Dummy problem returning random fitness.
struct RandomNEProblem {}

impl NeuroProblem for RandomNEProblem {
    // Return the number of NN inputs.
    fn get_inputs_num(&self) -> usize { 1 }
    // Return the number of NN outputs.
    fn get_outputs_num(&self) -> usize { 1 }
    // Return an NN with random weights and a fixed structure.
    fn get_default_net(&self) -> MultilayeredNetwork {
        let mut rng = rand::thread_rng();
        let mut net = MultilayeredNetwork::new(self.get_inputs_num(), self.get_outputs_num());
        net.add_hidden_layer(5usize, ActivationFunctionType::Sigmoid)
            .build(&mut rng, NeuralArchitecture::Multilayered);
        net
    }
    // Evaluate performance of a given NN on a random input.
    fn compute_with_net<T: NeuralNetwork>(&self, nn: &mut T) -> f32 {
        let mut rng: StdRng = StdRng::from_seed(&[0]);
        let input: Vec<f32> = (0..nn.get_inputs_num()).map(|_| rng.gen::<f32>()).collect();
        nn.compute(&input)[0]
    }
}
```