Two adaptive rejection sampling schemes for probability density functions with log-convex tails
Monte Carlo methods are often necessary for the implementation of optimal
Bayesian estimators. A fundamental technique that can be used to generate
samples from virtually any target probability distribution is the so-called
rejection sampling method, which generates candidate samples from a proposal
distribution and then accepts them or not by testing the ratio of the target
and proposal densities. The class of adaptive rejection sampling (ARS)
algorithms is particularly interesting because they can achieve high acceptance
rates. However, the standard ARS method can only be used with log-concave
target densities. For this reason, many generalizations have been proposed.
In this work, we investigate two different adaptive schemes that can be used
to draw exactly from a large family of univariate probability density functions
(pdf's), not necessarily log-concave, possibly multimodal and with tails of
arbitrary concavity. These techniques are adaptive in the sense that every time
a candidate sample is rejected, the acceptance rate is improved. The two
proposed algorithms can work properly when the target pdf is multimodal, with
first and second derivatives analytically intractable, and when the tails are
log-convex in an infinite domain. Therefore, they can be applied in a number of
scenarios in which other generalizations of the standard ARS fail. Two
illustrative numerical examples are provided.
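The accept/reject test described in this abstract (accept a candidate with probability proportional to the ratio of target and proposal densities) can be sketched in a few lines. The half-normal target, exponential proposal, and bound M below are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def rejection_sample(target_pdf, proposal_rvs, proposal_pdf, M, n):
    """Draw n samples from target_pdf by rejection sampling.

    Requires the envelope condition target_pdf(x) <= M * proposal_pdf(x).
    """
    samples = []
    while len(samples) < n:
        x = proposal_rvs()                            # candidate from the proposal
        u = rng.uniform()
        if u * M * proposal_pdf(x) <= target_pdf(x):  # accept w.p. ratio / M
            samples.append(x)
    return np.array(samples)

# Illustrative example: half-normal target with an Exp(1) proposal.
target = lambda x: np.sqrt(2.0 / np.pi) * np.exp(-x**2 / 2.0)  # half-normal pdf
prop_pdf = lambda x: np.exp(-x)                                 # Exp(1) pdf
M = np.sqrt(2.0 * np.e / np.pi)           # sup of target/proposal, attained at x = 1
xs = rejection_sample(target, lambda: rng.exponential(), prop_pdf, M, 5000)
```

The acceptance rate of this basic scheme is 1/M; the adaptive schemes discussed above improve the envelope after each rejection, driving the acceptance rate toward one.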
An Exact Auxiliary Variable Gibbs Sampler for a Class of Diffusions
Stochastic differential equations (SDEs) or diffusions are continuous-valued
continuous-time stochastic processes widely used in the applied and
mathematical sciences. Simulating paths from these processes is usually an
intractable problem, and typically involves time-discretization approximations.
We propose an exact Markov chain Monte Carlo sampling algorithm that involves
no such time-discretization error. Our sampler is applicable to the problem of
prior simulation from an SDE, posterior simulation conditioned on noisy
observations, as well as parameter inference given noisy observations. Our work
recasts an existing rejection sampling algorithm for a class of diffusions as a
latent variable model, and then derives an auxiliary variable Gibbs sampling
algorithm that targets the associated joint distribution. At a high level, the
resulting algorithm involves two steps: simulating a random grid of times from
an inhomogeneous Poisson process, and updating the SDE trajectory conditioned
on this grid. Our work allows the vast literature of Monte Carlo sampling
algorithms from the Gaussian process literature to be brought to bear to
applications involving diffusions. We study our method on synthetic and real
datasets, where we demonstrate superior performance over competing methods.
Comment: 37 pages, 13 figures
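The first of the two steps described above, simulating a random grid of times from an inhomogeneous Poisson process, is commonly done by thinning (Lewis-Shedler), assuming a known upper bound on the intensity. A minimal sketch; the sinusoidal intensity is an illustrative choice, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)

def inhomogeneous_poisson_times(rate, rate_bound, T):
    """Event times on [0, T] of a Poisson process with intensity rate(t),
    where rate(t) <= rate_bound, via Lewis-Shedler thinning."""
    times = []
    t = 0.0
    while True:
        # Candidate from the dominating homogeneous process of rate rate_bound.
        t += rng.exponential(1.0 / rate_bound)
        if t > T:
            break
        if rng.uniform() < rate(t) / rate_bound:  # keep w.p. rate(t) / bound
            times.append(t)
    return np.array(times)

# Illustrative example: sinusoidal intensity on [0, 10], bounded by 4.
lam = lambda t: 2.0 + 2.0 * np.sin(t)
grid = inhomogeneous_poisson_times(lam, 4.0, 10.0)
```

Conditioned on such a grid, the second step updates the trajectory only at finitely many times, which is what makes the overall sampler free of discretization error.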
Generalized Geometric Cluster Algorithm for Fluid Simulation
We present a detailed description of the generalized geometric cluster
algorithm for the efficient simulation of continuum fluids. The connection with
well-known cluster algorithms for lattice spin models is discussed, and an
explicit full cluster decomposition is derived for a particle configuration in
a fluid. We investigate a number of basic properties of the geometric cluster
algorithm, including the dependence of the cluster-size distribution on density
and temperature. Practical aspects of its implementation and possible
extensions are discussed. The capabilities and efficiency of our approach are
illustrated by means of two example studies.
Comment: Accepted for publication in Phys. Rev. E. Follow-up to
cond-mat/041274
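The cluster construction this abstract refers to can be illustrated for the simplest continuum fluid, hard disks: reflect a seed particle through a random pivot, then recursively reflect every particle that overlaps a freshly reflected one. A minimal sketch of one such point-reflection move (in the spirit of Dress and Krauth); the box size, disk diameter, and starting grid are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

def geometric_cluster_move(pos, box, sigma):
    """One rejection-free geometric cluster move for hard disks of
    diameter sigma in a periodic box of side `box`."""
    n = len(pos)
    pivot = rng.uniform(0.0, box, 2)
    stack = [int(rng.integers(n))]                # random seed particle
    moved = set()
    while stack:
        i = stack.pop()
        if i in moved:
            continue
        moved.add(i)
        pos[i] = (2.0 * pivot - pos[i]) % box     # point reflection through pivot
        for j in range(n):
            if j in moved:
                continue
            d = pos[j] - pos[i]
            d -= box * np.round(d / box)          # minimum-image convention
            if d @ d < sigma**2:                  # overlap: j joins the cluster
                stack.append(j)
    return pos, len(moved)

# Start from an overlap-free 5x5 grid (diameter 1, box side 10).
pos0 = np.array([[i, j] for i in (0, 2, 4, 6, 8)
                        for j in (0, 2, 4, 6, 8)], dtype=float)
pos1, cluster_size = geometric_cluster_move(pos0.copy(), 10.0, 1.0)
```

Because the point reflection is an isometry of the periodic box and every reflected particle is checked against all unmoved ones, the move always produces a valid (overlap-free) configuration, which is why it needs no accept/reject step.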
Non-linear regression models for Approximate Bayesian Computation
Approximate Bayesian inference on the basis of summary statistics is
well-suited to complex problems for which the likelihood is either
mathematically or computationally intractable. However, methods that rely on
rejection suffer from the curse of dimensionality as the number of summary
statistics increases. Here we propose a machine-learning approach to the
estimation of the posterior density by introducing two innovations. The new
method fits a nonlinear conditional heteroscedastic regression of the parameter
on the summary statistics, and then adaptively improves estimation using
importance sampling. The new algorithm is compared to the state-of-the-art
approximate Bayesian methods, and achieves considerable reduction of the
computational burden in two examples of inference in statistical genetics and
in a queueing model.
Comment: 4 figures; version 3 minor changes; to appear in Statistics and
Computing
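The rejection-based baseline that this abstract improves upon can be sketched in a few lines: simulate parameters from the prior, simulate summary statistics, and keep the parameters whose statistics fall closest to the observed one. The Gaussian toy model below is an illustrative assumption, not one of the paper's examples:

```python
import numpy as np

rng = np.random.default_rng(2)

def abc_rejection(observed_stat, prior_rvs, simulate_stat, n_sims, q=0.01):
    """Basic rejection ABC: keep the parameters whose simulated summary
    statistic is within the q-quantile of distances to the observed one."""
    theta = np.array([prior_rvs() for _ in range(n_sims)])
    stats = np.array([simulate_stat(t) for t in theta])
    dist = np.abs(stats - observed_stat)
    eps = np.quantile(dist, q)            # tolerance chosen as an empirical quantile
    keep = dist <= eps
    return theta[keep], stats[keep]

# Toy example: infer a Gaussian mean from the sample mean of 20 draws.
obs = 1.5
prior = lambda: rng.uniform(-5.0, 5.0)
sim = lambda mu: rng.normal(mu, 1.0, size=20).mean()
post, _ = abc_rejection(obs, prior, sim, 20000)
```

With several summary statistics, the distance concentrates and the accepted fraction collapses; the regression adjustment proposed above corrects the accepted parameters toward the observed statistics instead of relying on a vanishing tolerance.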
Markov chain Monte Carlo for exact inference for diffusions
We develop exact Markov chain Monte Carlo methods for discretely-sampled,
directly and indirectly observed diffusions. The qualification "exact" refers
to the fact that the invariant and limiting distribution of the Markov chains
is the posterior distribution of the parameters free of any discretisation
error. Our methods apply directly to the class of processes that can be
simulated using the most general exact simulation algorithm available to date.
The article introduces various methods to boost the performance of
the basic scheme, including reparametrisations and auxiliary Poisson sampling.
We contrast both theoretically and empirically how this new approach compares
to irreducible high frequency imputation, which is the state-of-the-art
alternative for the class of processes we consider, and we uncover intriguing
connections. All methods discussed in the article are tested on typical
examples.
Comment: 23 pages, 6 figures, 3 tables
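The retrospective accept/reject idea behind such exact simulation algorithms, including the auxiliary Poisson sampling mentioned above, can be sketched in the spirit of the EA1 exact algorithm of Beskos and Roberts: propose a Brownian bridge, lay down Poisson points, and accept if no point falls under the graph of a functional of the path. The drift sin(x), its bound M, and all names below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Diffusion dX = sin(X) dt + dB; A(x) = -cos(x) is an antiderivative of the
# drift, and phi = (drift^2 + drift')/2, shifted by +1/2 to lie in [0, M].
def phi(x):
    return (np.sin(x)**2 + np.cos(x)) / 2.0 + 0.5

M = 9.0 / 8.0                                    # sup of the shifted phi

def sample_endpoint(x0, T):
    # Endpoint density h(y) proportional to exp(A(y)) N(y; x0, T);
    # rejection against N(x0, T), using exp(-cos y) <= e.
    while True:
        y = rng.normal(x0, np.sqrt(T))
        if rng.uniform() < np.exp(-np.cos(y) - 1.0):
            return y

def exact_skeleton(x0, T):
    """Return (times, values): a path skeleton free of discretisation error."""
    while True:
        y = sample_endpoint(x0, T)
        kappa = rng.poisson(M * T)               # auxiliary Poisson grid
        t = np.sort(rng.uniform(0.0, T, kappa))
        marks = rng.uniform(0.0, M, kappa)
        # Brownian bridge from (0, x0) to (T, y), sampled sequentially at t.
        x = np.empty(kappa)
        prev_t, prev_x = 0.0, x0
        for i in range(kappa):
            mean = prev_x + (t[i] - prev_t) / (T - prev_t) * (y - prev_x)
            var = (t[i] - prev_t) * (T - t[i]) / (T - prev_t)
            x[i] = rng.normal(mean, np.sqrt(var))
            prev_t, prev_x = t[i], x[i]
        if np.all(phi(x) <= marks):              # no Poisson point under phi
            return (np.concatenate(([0.0], t, [T])),
                    np.concatenate(([x0], x, [y])))

times, values = exact_skeleton(0.0, 1.0)
```

The accepted skeleton is exact at the retained times; the reparametrisations and auxiliary Poisson sampling discussed above are ways of embedding this accept/reject mechanism inside an MCMC scheme for parameter inference.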