Struct dfdx::optim::Adam

pub struct Adam {
    pub lr: f32,
    pub betas: [f32; 2],
    pub eps: f32,
    /* private fields */
}

An implementation of the Adam optimizer from Adam: A Method for Stochastic Optimization
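
For reference, the update rule described in that paper, stated here in its standard form (this is a restatement for context, not lifted from the crate's source): for a parameter θ with gradient g_t at step t,

$$
\begin{aligned}
m_t &= \beta_1\, m_{t-1} + (1 - \beta_1)\, g_t \\
v_t &= \beta_2\, v_{t-1} + (1 - \beta_2)\, g_t^2 \\
\hat{m}_t &= m_t / (1 - \beta_1^t), \qquad \hat{v}_t = v_t / (1 - \beta_2^t) \\
\theta_t &= \theta_{t-1} - \mathrm{lr} \cdot \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)
\end{aligned}
$$

where β1 and β2 are the two entries of betas and ε is eps.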

Example Usage:

use dfdx::prelude::*; // assuming the prelude re-exports the tensor and optimizer items used below

let mut t = Tensor0D::ones();
let mut opt: Adam = Default::default();
let gradients = t.trace().backward(); // trace() records operations; backward() produces the gradients
opt.update(&mut t, gradients); // apply one Adam step to t in place

Changing default parameters:

let adam = Adam::new(1e-2, [0.5, 0.25], 1e-6);

Fields

lr: f32

Learning rate

betas: [f32; 2]

Betas from the Adam paper

eps: f32

Epsilon for numerical stability
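
Because these fields are public, the hyperparameters can also be adjusted on an existing optimizer. A minimal sketch, assuming (as is typical) the current field values are read on each update call:

use dfdx::optim::Adam;

let mut opt: Adam = Default::default();
// hypothetical manual learning-rate decay between epochs
opt.lr *= 0.5;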

Implementations

Construct with control over all fields.
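
The example above suggests the constructor takes the hyperparameters in field order; a minimal sketch under that assumption (the parameter names are not shown on this page):

use dfdx::optim::Adam;

let opt = Adam::new(
    5e-4,         // lr
    [0.9, 0.999], // betas
    1e-8,         // eps
);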

Trait Implementations

Debug
Formats the value using the given formatter.

Default
Uses the default parameters suggested in the paper: lr = 1e-3, betas = [0.9, 0.999], and eps = 1e-8.
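
A sketch of what that implies, assuming Default::default() populates the public fields with exactly those values:

use dfdx::optim::Adam;

let opt: Adam = Default::default();
assert_eq!(opt.lr, 1e-3);
assert_eq!(opt.betas, [0.9, 0.999]);
assert_eq!(opt.eps, 1e-8);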

GradientProvider
Retrieves the data associated with p if there is any. This can modify self, for instance if velocities are calculated based on the associated data!

Auto Trait Implementations

Blanket Implementations

Any::type_id
Gets the TypeId of self.

Borrow<T>::borrow
Immutably borrows from an owned value.

BorrowMut<T>::borrow_mut
Mutably borrows from an owned value.

From<T>::from
Returns the argument unchanged.

Into<U>::into
Calls U::from(self). That is, this conversion is whatever the implementation of From<T> for U chooses to do.

Should always be Self

TryFrom<U>
Error: the type returned in the event of a conversion error. try_from: performs the conversion.

TryInto<U>
Error: the type returned in the event of a conversion error. try_into: performs the conversion.