Trait border_core::core::base::Agent

pub trait Agent<E: Env>: Policy<E> {
    fn train(&mut self);
    fn eval(&mut self);
    fn is_train(&self) -> bool;
    fn observe(&mut self, step: Step<E>) -> Option<Record>;
    fn push_obs(&self, obs: &E::Obs);
    fn save<T: AsRef<Path>>(&self, path: T) -> Result<(), Box<dyn Error>>;
    fn load<T: AsRef<Path>>(&mut self, path: T) -> Result<(), Box<dyn Error>>;
}

Represents a trainable policy on an environment.
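
For orientation, a minimal sketch of the lifecycle an implementor goes through during training. The env.reset() and env.step() calls and the action value are hypothetical stand-ins for the environment side, not part of this trait:

agent.train();                        // switch the policy to training mode
let obs = env.reset();                // hypothetical: obtain the initial observation
agent.push_obs(&obs);                 // prime the agent after the reset
let step = env.step(&action);         // hypothetical: advance the environment
if let Some(record) = agent.observe(step) {
    // an optimization step was performed; `record` holds training metrics
}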

Required methods

fn train(&mut self)

Set the policy to training mode.

fn eval(&mut self)

Set the policy to evaluation mode.

fn is_train(&self) -> bool

Return whether the agent is in training mode.
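
A short sketch of switching modes, where agent is any implementor of this trait:

agent.train();                // enter training mode
assert!(agent.is_train());
agent.eval();                 // enter evaluation mode
assert!(!agent.is_train());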

fn observe(&mut self, step: Step<E>) -> Option<Record>

Observe a crate::core::base::Step object. The agent is expected to train its policy based on this observation.

If an optimization step was performed, this method returns Some(crate::core::record::Record); otherwise it returns None.
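
A sketch of handling the return value; the reason given for None in the comment is a typical one, not a guarantee of this trait:

match agent.observe(step) {
    Some(record) => {
        // an optimization step ran; log or aggregate the record
    }
    None => {
        // no optimization this call, e.g. the agent is still collecting samples
    }
}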

fn push_obs(&self, obs: &E::Obs)

Push an observation to the agent. This method is used when the environment is reset.
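
A sketch of the intended call site; env.reset() is a hypothetical stand-in for however the environment produces its initial observation:

let obs = env.reset();   // hypothetical: obtain the initial observation
agent.push_obs(&obs);    // hand it to the agent before the first step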

fn save<T: AsRef<Path>>(&self, path: T) -> Result<(), Box<dyn Error>>

Save the agent in the given directory. This method typically creates a number of files that together constitute the agent in the given directory. For example, the [crate::agent::tch::dqn::DQN] agent saves two Q-networks, corresponding to the original and target networks.

fn load<T: AsRef<Path>>(&mut self, path: T) -> Result<(), Box<dyn Error>>

Load the agent from the given directory.
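
A sketch of a save/load round trip; the directory path is illustrative, and the ? operator assumes a calling context that returns Result<(), Box<dyn Error>>:

agent.save("./checkpoints/agent")?;   // writes the agent's files into this directory
// ... later, on an agent constructed with the same configuration ...
agent.load("./checkpoints/agent")?;   // restores the agent from those files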

Implementors