Type Alias DiagGradNutsSettings
pub type DiagGradNutsSettings = DiagNutsSettings;
Deprecated since 0.0.0: Use DiagNutsSettings instead.


Backwards-compatible alias for DiagNutsSettings.
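Since the alias is deprecated, new code should name DiagNutsSettings directly. A minimal sketch; the crate-root import path nuts_rs::DiagNutsSettings and a Default impl are assumptions here, not confirmed by this page:

use nuts_rs::DiagNutsSettings; // preferred, non-deprecated name (path assumed)

// Both names refer to the same type, so existing code that mentions
// DiagGradNutsSettings keeps compiling; the alias only adds a
// deprecation warning.
fn make_settings() -> DiagNutsSettings {
    DiagNutsSettings::default() // assumes a Default impl
}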

Aliased Type

pub struct DiagGradNutsSettings {
    pub num_tune: u64,
    pub num_draws: u64,
    pub maxdepth: u64,
    pub mindepth: u64,
    pub store_gradient: bool,
    pub store_unconstrained: bool,
    pub store_transformed: bool,
    pub max_energy_error: f64,
    pub store_divergences: bool,
    pub adapt_options: EuclideanAdaptOptions<DiagAdaptExpSettings>,
    pub check_turning: bool,
    pub target_integration_time: Option<f64>,
    pub trajectory_kind: KineticEnergyKind,
    pub num_chains: usize,
    pub seed: u64,
    pub extra_doublings: u64,
}
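A few fields can be overridden while keeping the rest at their defaults via struct-update syntax. This is a sketch under the same assumptions as above (crate-root path, Default impl); the field names come from the definition:

use nuts_rs::DiagNutsSettings; // path assumed

let settings = DiagNutsSettings {
    num_tune: 1_000,
    num_draws: 2_000,
    seed: 42,
    ..Default::default() // assumes a Default impl
};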

Fields

num_tune: u64

The number of tuning steps, where we fit the step size and geometry.

num_draws: u64

The number of draws after tuning.

maxdepth: u64

The maximum tree depth during sampling. The number of leapfrog steps is smaller than 2^maxdepth.

mindepth: u64

The minimum tree depth during sampling. The number of leapfrog steps is larger than 2^mindepth.
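Taken together, mindepth and maxdepth put exclusive bounds on the number of leapfrog steps per trajectory. A pure-arithmetic illustration, not crate code:

// Illustration only: exclusive bounds implied by the two depth settings.
fn leapfrog_step_bounds(mindepth: u64, maxdepth: u64) -> (u64, u64) {
    // steps > 2^mindepth and steps < 2^maxdepth
    (1u64 << mindepth, 1u64 << maxdepth)
}

// e.g. mindepth = 2, maxdepth = 10: more than 4 and fewer than 1024 steps.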

store_gradient: bool

Store the gradient in the SampleStats.

store_unconstrained: bool

Store each unconstrained parameter vector in the sampler stats.

store_transformed: bool

Store the transformed gradient and value in the sampler stats.

max_energy_error: f64

If the energy error is larger than this threshold, we treat the leapfrog step as a divergence.
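As a rough sketch of the rule described here (an illustration, not the crate's internal check):

// Hypothetical helper mirroring the documented threshold rule.
fn is_divergent(energy_error: f64, max_energy_error: f64) -> bool {
    // Treating non-finite errors as divergent is an assumption of
    // this sketch, not documented behavior.
    !energy_error.is_finite() || energy_error > max_energy_error
}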

store_divergences: bool

Store detailed information about each divergence in the sampler stats.

adapt_options: EuclideanAdaptOptions<DiagAdaptExpSettings>

Settings for geometry adaptation.

check_turning: bool

target_integration_time: Option<f64>

trajectory_kind: KineticEnergyKind

Selects the kinetic-energy form and the corresponding integrator.

num_chains: usize

seed: u64

extra_doublings: u64

Number of extra doublings to perform after reaching maxdepth. This can be used to increase the effective sample size at the cost of more expensive sampling.
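Assuming each extra doubling doubles the trajectory again (a reading of the description above, not confirmed by this page), the step bound grows from 2^maxdepth to 2^(maxdepth + extra_doublings):

// Illustration only, under the doubling assumption stated above.
let maxdepth: u64 = 10;
let extra_doublings: u64 = 2;
let max_steps_bound = 1u64 << (maxdepth + extra_doublings); // 4096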