Auto-generated module
🤖 Generated with SplitRS
Functions
- abc_smc_consistency_ty - Approximate Bayesian Computation Consistency: ABC-SMC converges to the correct posterior as the tolerance ε → 0.
- ais_unbiased_ty - Annealed Importance Sampling Unbiasedness: AIS produces unbiased estimates of the normalising constant.
- app
- app2
- app3
- arrow
- bayes_measure_theory_ty - Measure-Theoretic Bayes: the posterior is the Radon-Nikodym derivative of the joint w.r.t. the marginal likelihood.
- bool_ty
- build_probabilistic_programming_env - Populate an Environment with all probabilistic-programming axiom declarations.
- bvar
- cst
- density_ty - Density : Type → Type, a density/pmf function on a type.
- diffusion_score_matching_ty - Diffusion Model Score Matching: the reverse diffusion score function minimises the denoising score matching objective.
- dsm_equals_sm_ty - Denoising Score Matching Objective: the DSM objective equals the implicit score matching objective under Gaussian noise.
- elbo_lower_bound_ty - ELBO Lower Bound: ELBO(q) ≤ log p(x) for all variational families q.
- elbo_ty - ELBO : (Type → Type) → Type → Real, the evidence lower bound for variational inference.
- ep_fixed_point_ty - Expectation Propagation Fixed Point: EP converges when the cavity distribution and the tilted distribution agree.
- evol_mcmc_detailed_balance_ty - Evolutionary MCMC Detailed Balance: evolutionary MCMC satisfies detailed balance with respect to a product-form invariant distribution.
- flow_matching_ode_ty - Flow Matching ODE Correctness: the conditional flow matching ODE generates the correct marginal distribution at time t = 1.
- gibbs_invariant_ty - Gibbs Sampling Invariance: the Gibbs sampler leaves the joint distribution invariant.
- giry_monad_laws_ty - Giry Monad Laws: the distribution monad satisfies the monad laws.
- gp_marginal_gaussian_ty - GP Marginal Likelihood: the marginal likelihood of a GP is Gaussian.
- gp_posterior_is_gp_ty - Gaussian Process Posterior: the posterior of a GP given observations is also a GP.
- grad_log_normalizer_ty - Gradient of Log Normalizer: the gradient of the log normalizer of an exponential family equals the mean of the sufficient statistics.
- gradient_estimator_ty - GradientEstimator : Type → Type, a Monte Carlo gradient estimator.
- hmc_invariant_ty - HMC Correctness: Hamiltonian Monte Carlo leaves the target distribution invariant.
- importance_weight_ty - ImportanceWeight : Type → Real, a self-normalised importance weight.
- is_consistency_ty - Importance Sampling Consistency: the IS estimator is consistent as N → ∞.
- kde_consistency_ty - Kernel Density Estimation Consistency: the KDE converges to the true density in L2 as n → ∞ with the optimal bandwidth.
- kernel_ty - Kernel : Type → Type → Type, a Markov kernel k(x, A).
- langevin_convergence_ty - Langevin Dynamics Convergence: the unadjusted Langevin algorithm (ULA) converges to the target in 2-Wasserstein distance under strong convexity.
- list_ty
- mean_field_cavi_ty - Variational Inference Mean-Field Factorization: the mean-field approximation optimises each factor holding the others fixed via coordinate ascent.
- measurable_space_ty - MeasurableSpace : Type, a type equipped with a σ-algebra.
- measure_transport_exists_ty - Measure Transport Existence: for any two probability measures with the same total mass there exists a measurable transport map.
- measure_ty - Measure : Type → Type, a σ-finite measure on a measurable space.
- mh_detailed_balance_ty - Metropolis-Hastings Detailed Balance: the MH kernel satisfies detailed balance w.r.t. the target distribution.
- nat_ty
- nested_mc_bias_ty - Nested Monte Carlo Estimator Bias: nested MC estimators are biased but consistent as the inner sample size grows.
- normalizing_flow_cov_ty - Normalizing Flow Change of Variables: the pushforward density satisfies the change-of-variables formula.
- ot_kantorovich_ty - Optimal Transport Kantorovich Duality: the Wasserstein-1 distance equals the supremum over 1-Lipschitz functions of the difference in expectations.
- parallel_tempering_exchange_ty - Parallel Tempering Exchange Correctness: the swap move in parallel tempering preserves the joint invariant distribution.
- particle_filter_ty - ParticleFilter : Type → Type, a sequential Monte Carlo state estimator.
- pathwise_gradient_unbiased_ty - Pathwise Gradient Unbiasedness: the reparameterised (pathwise) gradient is an unbiased estimator when the reparameterisation is differentiable.
- pbp_gaussian_propagation_ty - Probabilistic Backpropagation Gaussian Propagation: PBP propagates a Gaussian approximation through each layer of a neural network.
- pi
- pmc_consistency_ty - Population Monte Carlo Consistency: PMC estimators are consistent as population size and iterations grow.
- pmmh_correctness_ty - Particle Marginal Metropolis-Hastings Correctness: PMMH targeting the exact posterior is asymptotically exact.
- pn_integration_ty - Probabilistic Numerics Integration: Bayesian quadrature produces a posterior over integrals.
- ppl_program_ty - PPLProgram : Type → Type, a probabilistic program returning values of type A.
- probability_monad_ty - ProbabilityMonad : (Type → Type), the distribution / Giry monad.
- prop
- real_ty
- reparam_unbiased_ty - Reparameterisation Gradient: the reparameterised gradient estimator is an unbiased estimator of ∇_φ E_{z∼q_φ}[f(z)].
- sampler_ty - Sampler : Type → Type, a procedure that draws samples from a distribution.
- score_fn_unbiased_ty - Score Function Estimator Unbiasedness: the REINFORCE estimator is an unbiased gradient estimator under mild regularity conditions.
- sigma_algebra_ty - SigmaAlgebra : Type → Type, a σ-algebra of subsets.
- simulated_annealing_convergence_ty - Simulated Annealing Convergence: simulated annealing converges to a global optimum under a logarithmic cooling schedule.
- smc_consistency_ty - SMC Consistency: sequential Monte Carlo converges to the true filtering distribution.
- smc_feynman_kac_ty - SMC Feynman-Kac: SMC computes the Feynman-Kac normalising constant exactly in expectation.
- smc_genealogy_ty - Sequential Monte Carlo Genealogy: the ancestral lineage in SMC traces back through the resampling steps.
- stein_disc_zero_iff_ty - Stein Discrepancy Zero Iff Same Distribution: the kernel Stein discrepancy between two measures is zero if and only if they are equal.
- stein_identity_ty - Stein Identity: for any smooth function h and score function s_p = ∇ log p, E_p[∇h(x) + h(x) s_p(x)] = 0.
- svgd_convergence_ty - Stein Variational Gradient Descent Convergence: SVGD converges to the target distribution in the Stein discrepancy sense.
- svi_convergence_ty - Stochastic Variational Inference (SVI) Convergence: SVI converges to a local ELBO maximum.
- type0
- type1
- vae_elbo_decomp_ty - VAE ELBO Decomposition: the VAE objective decomposes as a reconstruction term minus a KL divergence.
- vae_posterior_collapse_risk_ty - Variational Autoencoder Posterior Collapse: with a sufficiently expressive decoder there exists a risk of posterior collapse.
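The *_ty functions above each build a type-level encoding of one of these statements. As a rough illustration of what such an encoding expresses (a sketch only: Real, Data, VariationalFamily, le, elbo, and logEvidence are hypothetical placeholder names, not identifiers exported by this module), the statement behind elbo_lower_bound_ty could be written in a dependently-typed setting like this:

```lean
-- Hypothetical sketch of the statement encoded by `elbo_lower_bound_ty`.
-- All names below are placeholders, not this module's actual API.
axiom Real : Type
axiom Data : Type
axiom VariationalFamily : Type
axiom le : Real → Real → Prop                 -- the ≤ relation on Real
axiom elbo : VariationalFamily → Data → Real  -- ELBO(q) evaluated at x
axiom logEvidence : Data → Real               -- log p(x)

-- ELBO Lower Bound: ELBO(q) ≤ log p(x) for every variational family q.
axiom elbo_lower_bound :
  ∀ (q : VariationalFamily) (x : Data), le (elbo q x) (logEvidence x)
```

The other entries follow the same pattern: a *_ty function returns the proposition's type, and the generated environment registers it as an axiom declaration.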