Struct fann::FannTrainer
pub struct FannTrainer<'a> { /* private fields */ }
A training configuration. Create it with Fann::on_data or Fann::on_file, and run the training with train.
Implementations

impl<'a> FannTrainer<'a>
pub fn with_reports(self, interval: c_uint) -> FannTrainer<'a>
Activates printing reports periodically. Between two reports, interval neurons are added (for cascade training) or training runs for interval epochs (otherwise).
pub fn with_callback(
    self,
    interval: c_uint,
    callback: &'a dyn Fn(&Fann, &TrainData, c_uint) -> CallbackResult
) -> FannTrainer<'a>
Configures a callback to be called periodically during training. The callback runs every interval epochs (for regular training) or every time interval new neurons have been added (for cascade training). It receives as arguments:

- a reference to the current Fann,
- a reference to the training data,
- the number of steps (added neurons or epochs) taken so far.
pub fn cascade(self) -> FannTrainer<'a>
Use the Cascade2 algorithm: this adds neurons to the neural network while training, starting from an ANN without any hidden layers. The network must use shortcut connections, so it needs to be created like this:
let td = fann::TrainData::from_file("test_files/xor.data").unwrap();
let fann = fann::Fann::new_shortcut(&[td.num_input(), td.num_output()]).unwrap();
pub fn train(
    &mut self,
    max_steps: c_uint,
    desired_error: c_float
) -> FannResult<()>
Train the network until either the mean square error drops below the desired_error, or the maximum number of steps is reached. If cascade training is activated, max_steps refers to the number of neurons that are added; otherwise it is the maximum number of training epochs.