Patterns of Scalable Bayesian Inference
Datasets are growing not just in size but in complexity, creating a demand
for rich models and quantification of uncertainty. Bayesian methods are an
excellent fit for this demand, but scaling Bayesian inference is a challenge.
In response to this challenge, there has been considerable recent work based on
varying assumptions about model structure, underlying computational resources,
and the importance of asymptotic correctness. As a result, there is a zoo of
ideas with few clear overarching principles.
In this paper, we seek to identify unifying principles, patterns, and
intuitions for scaling Bayesian inference. We review existing work on utilizing
modern computing resources with both MCMC and variational approximation
techniques. From this taxonomy of ideas, we characterize the general principles
that have proven successful for designing scalable inference procedures and
comment on the path forward.
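One recurring pattern the survey covers is scaling MCMC by replacing full-data gradients with minibatch estimates, as in stochastic gradient Langevin dynamics. The sketch below is an illustrative minimal example of that idea (not a procedure from the paper itself); the model, step size, and batch size are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: N observations from N(true_mu, 1).
N, true_mu = 10_000, 2.0
data = rng.normal(true_mu, 1.0, size=N)

def grad_log_post(theta, batch):
    """Stochastic gradient of the log posterior under a N(0, 10^2) prior
    and N(theta, 1) likelihood, rescaling the minibatch term to full-data size."""
    prior_grad = -theta / 10.0**2
    lik_grad = (N / len(batch)) * np.sum(batch - theta)
    return prior_grad + lik_grad

# SGLD: a gradient step plus injected Gaussian noise scaled to the step size,
# so the iterates approximately sample the posterior without full-data sweeps.
theta, step, samples = 0.0, 1e-5, []
for _ in range(5_000):
    batch = rng.choice(data, size=100, replace=False)
    theta += 0.5 * step * grad_log_post(theta, batch) \
             + rng.normal(0.0, np.sqrt(step))
    samples.append(theta)

posterior_mean = np.mean(samples[2_000:])  # discard burn-in
print(posterior_mean)  # close to the data mean, here near 2.0
```

Each iteration touches only 100 of the 10,000 observations, which is the source of the scalability; the injected noise is what distinguishes this from plain stochastic gradient ascent on the log posterior.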
Distances between nested densities and a measure of the impact of the prior in Bayesian statistics
In this paper we propose tight upper and lower bounds for the Wasserstein
distance between any two univariate continuous distributions with
probability densities having nested supports. These explicit
bounds are expressed in terms of the derivative of the likelihood ratio
as well as the associated Stein kernel. The method of proof
relies on a new variant of Stein's method which manipulates Stein operators.
We give several applications of these bounds. Our main application is in
Bayesian statistics: we derive explicit data-driven bounds on the Wasserstein
distance between the posterior distribution based on a given prior and the
no-prior posterior based uniquely on the sampling distribution. This is the
first finite-sample result confirming the well-known fact that with
well-identified parameters and large sample sizes, reasonable choices of prior
distributions will have only minor effects on posterior inferences if the data
are benign.
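The phenomenon the bounds quantify can be seen numerically in a conjugate Gaussian model: with a well-identified parameter and moderate sample size, the posterior under a proper prior and the "no-prior" posterior based only on the sampling distribution are close in Wasserstein distance. The sketch below is an illustrative Monte Carlo check of this fact, not the paper's explicit bounds; the model and prior choices are assumptions.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)

# n observations from a N(theta, 1) sampling model.
n, theta_true = 200, 1.5
x = rng.normal(theta_true, 1.0, size=n)
xbar = x.mean()

# Conjugate N(m0, s0^2) prior on theta gives a Gaussian posterior.
m0, s0 = 0.0, 2.0
post_var = 1.0 / (1.0 / s0**2 + n)
post_mean = post_var * (m0 / s0**2 + n * xbar)

# "No-prior" posterior based only on the sampling distribution: N(xbar, 1/n).
flat_mean, flat_var = xbar, 1.0 / n

# Empirical 1-Wasserstein distance between samples from the two posteriors.
a = rng.normal(post_mean, np.sqrt(post_var), size=50_000)
b = rng.normal(flat_mean, np.sqrt(flat_var), size=50_000)
dist = wasserstein_distance(a, b)
print(dist)  # small at n = 200, and it shrinks further as n grows
```

Rerunning with larger n shows the distance decaying toward zero, which is the qualitative content of the finite-sample statement above.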