
Module functions


Auto-generated module

🤖 Generated with SplitRS

Functions

abc_smc_consistency_ty
Approximate Bayesian Computation Consistency: ABC-SMC converges to the correct posterior as the tolerance ε → 0.
ais_unbiased_ty
Annealed Importance Sampling Unbiasedness: AIS produces unbiased estimates of the normalising constant.
app
app2
app3
arrow
bayes_measure_theory_ty
Measure-Theoretic Bayes: the posterior is the Radon-Nikodym derivative of the joint w.r.t. the marginal likelihood.
bool_ty
build_probabilistic_programming_env
Populate an Environment with all probabilistic-programming axiom declarations.
bvar
cst
density_ty
Density : Type → Type — a density/pmf function on a type.
diffusion_score_matching_ty
Diffusion Model Score Matching: the reverse diffusion score function minimises the denoising score matching objective.
dsm_equals_sm_ty
Denoising Score Matching Objective: DSM objective equals implicit score matching objective under Gaussian noise.
elbo_lower_bound_ty
ELBO Lower Bound: ELBO(q) ≤ log p(x) for every variational distribution q.
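The bound can be checked numerically on a toy two-state latent variable. The sketch below uses illustrative numbers and is not part of this generated module's API; `elbo` and `log_evidence` are hypothetical helper names:

```rust
// Verify ELBO(q) <= log p(x) for a two-state latent z with prior p(z)
// and likelihood p(x | z), across several variational distributions q.
fn elbo(q: [f64; 2], prior: [f64; 2], lik: [f64; 2]) -> f64 {
    (0..2).map(|z| q[z] * ((prior[z] * lik[z]).ln() - q[z].ln())).sum()
}

fn log_evidence(prior: [f64; 2], lik: [f64; 2]) -> f64 {
    (prior[0] * lik[0] + prior[1] * lik[1]).ln()
}

fn main() {
    let prior = [0.3, 0.7]; // p(z)
    let lik = [0.9, 0.2];   // p(x | z) for the observed x
    for &q0 in &[0.1, 0.5, 0.9] {
        let gap = log_evidence(prior, lik) - elbo([q0, 1.0 - q0], prior, lik);
        assert!(gap >= -1e-12); // the ELBO never exceeds the log evidence
    }
    println!("ELBO <= log p(x) verified");
}
```

The gap is exactly KL(q ‖ p(z|x)), which is why it is non-negative and vanishes only at the true posterior.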
elbo_ty
ELBO : (Type → Type) → Type → Real — evidence lower bound for variational inference.
ep_fixed_point_ty
Expectation Propagation Fixed Point: EP converges when the cavity distribution and tilted distribution agree.
evol_mcmc_detailed_balance_ty
Evolutionary MCMC Detailed Balance: evolutionary MCMC satisfies detailed balance with respect to a product-form invariant distribution.
flow_matching_ode_ty
Flow Matching ODE Correctness: the conditional flow matching ODE generates the correct marginal distribution at time t=1.
gibbs_invariant_ty
Gibbs Sampling Invariance: the Gibbs sampler leaves the joint distribution invariant.
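For a minimal concrete check, a systematic-scan Gibbs sweep on a 2×2 discrete joint (resample x given y, then y given x) can be verified to leave the joint invariant by summing the kernel exactly. The numbers and the helper `sweep_mass` are illustrative, not crate API:

```rust
// Probability mass that one Gibbs sweep deposits on state (xp, yp),
// starting from the joint p. Invariance means this equals p[xp][yp].
fn sweep_mass(p: [[f64; 2]; 2], xp: usize, yp: usize) -> f64 {
    let px = [p[0][0] + p[0][1], p[1][0] + p[1][1]]; // marginal p(x)
    let py = [p[0][0] + p[1][0], p[0][1] + p[1][1]]; // marginal p(y)
    let mut mass = 0.0;
    for x in 0..2 {
        for y in 0..2 {
            // Kernel K((x,y) -> (xp,yp)) = p(xp | y) * p(yp | xp)
            mass += p[x][y] * (p[xp][y] / py[y]) * (p[xp][yp] / px[xp]);
        }
    }
    mass
}

fn main() {
    let p = [[0.1, 0.2], [0.3, 0.4]]; // arbitrary joint p(x, y)
    for xp in 0..2 {
        for yp in 0..2 {
            assert!((sweep_mass(p, xp, yp) - p[xp][yp]).abs() < 1e-12);
        }
    }
    println!("Gibbs sweep leaves the joint invariant");
}
```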
giry_monad_laws_ty
Giry Monad Laws: the distribution monad satisfies monad laws.
gp_marginal_gaussian_ty
GP Marginal Likelihood: the marginal likelihood of a GP is Gaussian.
gp_posterior_is_gp_ty
Gaussian Process Posterior: the posterior of a GP given observations is also a GP.
grad_log_normalizer_ty
Gradient of Log Normalizer: the gradient of the log normalizer of an exponential family equals the mean of the sufficient statistics.
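A concrete instance: the Bernoulli family in natural form has log normalizer A(η) = log(1 + e^η) and identity sufficient statistic, so ∂A/∂η should equal the mean σ(η). The sketch below (illustrative, not crate API) checks this with a central finite difference:

```rust
// Exponential-family identity: dA/d(eta) = E[sufficient statistic].
fn log_normalizer(eta: f64) -> f64 {
    (1.0 + eta.exp()).ln() // A(eta) for the Bernoulli family
}

fn mean_sufficient_stat(eta: f64) -> f64 {
    1.0 / (1.0 + (-eta).exp()) // sigmoid(eta) = E[T(x)]
}

fn main() {
    let (eta, h) = (0.7, 1e-6);
    // Central finite difference of the log normalizer.
    let grad = (log_normalizer(eta + h) - log_normalizer(eta - h)) / (2.0 * h);
    assert!((grad - mean_sufficient_stat(eta)).abs() < 1e-6);
    println!("dA/deta matches the mean sufficient statistic");
}
```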
gradient_estimator_ty
GradientEstimator : Type → Type — a Monte Carlo gradient estimator.
hmc_invariant_ty
HMC Correctness: Hamiltonian Monte Carlo leaves the target distribution invariant.
importance_weight_ty
ImportanceWeight : Type → Real — self-normalised importance weight.
is_consistency_ty
Importance Sampling Consistency: the IS estimator is consistent as N → ∞.
kde_consistency_ty
Kernel Density Estimation Consistency: the KDE converges to the true density in L2 as n → ∞ with optimal bandwidth.
kernel_ty
Kernel : Type → Type → Type — a Markov kernel k(x, A).
langevin_convergence_ty
Langevin Dynamics Convergence: the unadjusted Langevin algorithm (ULA) converges to the target in 2-Wasserstein under strong convexity.
list_ty
mean_field_cavi_ty
Variational Inference Mean-Field Factorization: the mean-field approximation optimises each factor holding others fixed via coordinate ascent.
measurable_space_ty
MeasurableSpace : Type — a type equipped with a σ-algebra.
measure_transport_exists_ty
Measure Transport Existence: for any two probability measures with the same total mass there exists a measurable transport map.
measure_ty
Measure : Type → Type — a σ-finite measure on a measurable space.
mh_detailed_balance_ty
Metropolis-Hastings Detailed Balance: MH kernel satisfies detailed balance w.r.t. the target distribution.
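Detailed balance can be verified exactly on a small discrete chain. The sketch below (illustrative target, symmetric uniform proposal; `mh_transition` is a hypothetical helper, not crate API) checks π_i K(i→j) = π_j K(j→i):

```rust
// MH transition probability i -> j on a 3-state target pi, with a
// symmetric proposal that picks each of the other two states w.p. 1/2.
fn mh_transition(pi: [f64; 3], i: usize, j: usize) -> f64 {
    0.5 * (pi[j] / pi[i]).min(1.0) // proposal prob times MH acceptance
}

fn main() {
    let pi = [0.2, 0.3, 0.5]; // target distribution
    for i in 0..3 {
        for j in 0..3 {
            if i == j {
                continue;
            }
            // Probability flow i -> j must equal the reverse flow.
            let flow_ij = pi[i] * mh_transition(pi, i, j);
            let flow_ji = pi[j] * mh_transition(pi, j, i);
            assert!((flow_ij - flow_ji).abs() < 1e-12);
        }
    }
    println!("detailed balance holds");
}
```

The identity holds because π_i · min(1, π_j/π_i) = min(π_i, π_j), which is symmetric in i and j.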
nat_ty
nested_mc_bias_ty
Nested Monte Carlo Estimator Bias: nested MC estimators are biased but consistent as the inner sample size grows.
normalizing_flow_cov_ty
Normalizing Flow Change of Variables: the pushforward density satisfies the change-of-variables formula.
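The one-dimensional affine flow x = a·z + b with base z ~ N(0,1) gives a closed-form check of the formula: the pushforward density p_z((x−b)/a)/|a| must coincide with the N(b, a²) density. Parameters below are illustrative, and the helpers are not crate API:

```rust
fn std_normal_pdf(z: f64) -> f64 {
    (-0.5 * z * z).exp() / (2.0 * std::f64::consts::PI).sqrt()
}

// Change-of-variables density of the flow x = a*z + b.
fn pushforward_density(a: f64, b: f64, x: f64) -> f64 {
    std_normal_pdf((x - b) / a) / a.abs()
}

// Direct density of N(mean, sd^2) for comparison.
fn gaussian_pdf(mean: f64, sd: f64, x: f64) -> f64 {
    (-(x - mean).powi(2) / (2.0 * sd * sd)).exp()
        / (sd * (2.0 * std::f64::consts::PI).sqrt())
}

fn main() {
    let (a, b) = (2.0, 1.0);
    for &x in &[-1.0, 0.0, 2.5] {
        assert!((pushforward_density(a, b, x) - gaussian_pdf(b, a.abs(), x)).abs() < 1e-12);
    }
    println!("change-of-variables formula verified");
}
```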
ot_kantorovich_ty
Optimal Transport Kantorovich Duality: the Wasserstein-1 distance equals the supremum, over 1-Lipschitz functions, of the difference in expectations under the two measures.
parallel_tempering_exchange_ty
Parallel Tempering Exchange Correctness: the swap move in parallel tempering preserves the joint invariant distribution.
particle_filter_ty
ParticleFilter : Type → Type — sequential Monte Carlo state estimator.
pathwise_gradient_unbiased_ty
Pathwise Gradient Unbiasedness: the reparameterised (pathwise) gradient is an unbiased estimator when the reparameterisation is differentiable.
pbp_gaussian_propagation_ty
Probabilistic Backpropagation Gaussian Propagation: PBP propagates a Gaussian approximation through each layer of a neural network.
pi
pmc_consistency_ty
Population Monte Carlo Consistency: PMC estimators are consistent as population size and iterations grow.
pmmh_correctness_ty
Particle Marginal Metropolis-Hastings Correctness: PMMH targets the exact posterior even though the likelihood is replaced by an unbiased particle estimate.
pn_integration_ty
Probabilistic Numerics Integration: Bayesian quadrature produces a posterior over integrals.
ppl_program_ty
PPLProgram : Type → Type — a probabilistic program returning values of type A.
probability_monad_ty
ProbabilityMonad : (Type → Type) — the distribution / Giry monad.
prop
real_ty
reparam_unbiased_ty
Reparameterisation Gradient: the reparameterised gradient estimator is unbiased for ∇_φ E_{z∼q_φ}[f(z)].
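A worked instance: for q_μ = N(μ, 1) and f(z) = z², the reparameterisation z = μ + ε with ε ~ N(0,1) gives the pathwise estimator 2(μ + ε), whose expectation is the true gradient 2μ. The sketch below (illustrative, not crate API) takes the expectation over ε by deterministic midpoint quadrature rather than sampling:

```rust
// E_eps[2 * (mu + eps)] computed by midpoint quadrature on [-10, 10],
// where eps ~ N(0,1); should equal the true gradient 2 * mu.
fn pathwise_gradient(mu: f64) -> f64 {
    let (lo, hi, n) = (-10.0, 10.0, 200_000);
    let dx = (hi - lo) / n as f64;
    let mut grad = 0.0;
    for k in 0..n {
        let eps = lo + (k as f64 + 0.5) * dx;
        let pdf = (-0.5 * eps * eps).exp() / (2.0 * std::f64::consts::PI).sqrt();
        grad += 2.0 * (mu + eps) * pdf * dx; // pathwise gradient, weighted
    }
    grad
}

fn main() {
    let mu = 0.8;
    assert!((pathwise_gradient(mu) - 2.0 * mu).abs() < 1e-6);
    println!("pathwise gradient matches 2*mu");
}
```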
sampler_ty
Sampler : Type → Type — a procedure that draws samples from a distribution.
score_fn_unbiased_ty
Score Function Estimator Unbiasedness: the REINFORCE estimator is an unbiased gradient estimator under mild regularity conditions.
sigma_algebra_ty
SigmaAlgebra : Type → Type — a σ-algebra of subsets.
simulated_annealing_convergence_ty
Simulated Annealing Convergence: simulated annealing converges to a global optimum under a logarithmic cooling schedule.
smc_consistency_ty
SMC Consistency: sequential Monte Carlo converges to the true filtering distribution.
smc_feynman_kac_ty
SMC Feynman-Kac: SMC computes the Feynman-Kac normalising constant unbiasedly, i.e. exactly in expectation.
smc_genealogy_ty
Sequential Monte Carlo Genealogy: the ancestral lineage in SMC traces back through the resampling steps.
stein_disc_zero_iff_ty
Stein Discrepancy Zero Iff Same Distribution: the kernel Stein discrepancy between two measures is zero if and only if they are equal.
stein_identity_ty
Stein Identity: for any smooth, suitably decaying function h and score function s_p = ∇ log p, E_p[∇h(x) + h(x) s_p(x)] = 0.
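For p = N(0,1) the score is s_p(x) = −x, so with h(x) = x² the identity says E[2x − x³] = 0. The sketch below (illustrative, not crate API) checks this by computing the expectation with midpoint quadrature on a wide interval:

```rust
// Residual of the Stein identity E_p[grad h + h * s_p] for p = N(0,1),
// h(x) = x^2 (grad h = 2x), s_p(x) = -x; should be ~ 0.
fn stein_residual() -> f64 {
    let (lo, hi, n) = (-10.0, 10.0, 200_000);
    let dx = (hi - lo) / n as f64;
    let mut integral = 0.0;
    for k in 0..n {
        let x = lo + (k as f64 + 0.5) * dx;
        let pdf = (-0.5 * x * x).exp() / (2.0 * std::f64::consts::PI).sqrt();
        integral += (2.0 * x + x * x * (-x)) * pdf * dx; // grad h + h * s_p
    }
    integral
}

fn main() {
    assert!(stein_residual().abs() < 1e-6);
    println!("Stein identity residual ~ 0");
}
```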
svgd_convergence_ty
Stein Variational Gradient Descent Convergence: SVGD converges to the target distribution in the Stein discrepancy sense.
svi_convergence_ty
Stochastic Variational Inference (SVI) Convergence: SVI converges to a local ELBO maximum.
type0
type1
vae_elbo_decomp_ty
VAE ELBO Decomposition: the VAE objective decomposes as reconstruction term minus KL divergence.
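On a two-state latent the decomposition is a finite sum and can be checked exactly. The sketch below uses illustrative numbers; the helper names are hypothetical, not crate API:

```rust
// ELBO written directly, versus reconstruction minus KL(q || prior).
fn elbo(q: [f64; 2], prior: [f64; 2], lik: [f64; 2]) -> f64 {
    (0..2).map(|z| q[z] * ((prior[z] * lik[z]).ln() - q[z].ln())).sum()
}

fn reconstruction(q: [f64; 2], lik: [f64; 2]) -> f64 {
    (0..2).map(|z| q[z] * lik[z].ln()).sum() // E_q[log p(x | z)]
}

fn kl(q: [f64; 2], prior: [f64; 2]) -> f64 {
    (0..2).map(|z| q[z] * (q[z] / prior[z]).ln()).sum() // KL(q || p(z))
}

fn main() {
    let (prior, lik, q) = ([0.3, 0.7], [0.9, 0.2], [0.6, 0.4]);
    let lhs = elbo(q, prior, lik);
    let rhs = reconstruction(q, lik) - kl(q, prior);
    assert!((lhs - rhs).abs() < 1e-12);
    println!("ELBO = reconstruction - KL");
}
```

The identity is pure algebra: log p(z)p(x|z) − log q(z) splits into log p(x|z) plus log p(z) − log q(z).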
vae_posterior_collapse_risk_ty
Variational Autoencoder Posterior Collapse: with a sufficiently expressive decoder the variational posterior risks collapsing to the prior.