
    Simulation in Statistics

    Simulation has become a standard tool in statistics because it may be the only tool available for analysing some classes of probabilistic models. We review in this paper simulation tools that have been specifically derived to address statistical challenges and, in particular, recent advances in the areas of adaptive Markov chain Monte Carlo (MCMC) algorithms and approximate Bayesian computation (ABC) algorithms. Comment: Draft of an advanced tutorial paper for the Proceedings of the 2011 Winter Simulation Conference.
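    The ABC idea referred to above can be illustrated with a minimal rejection sampler. This is a generic sketch, not taken from the paper; the toy model, tolerance, and summary statistic below are invented for illustration:

```python
import random

def abc_rejection(observed, prior_sample, simulate, n_draws=5000, tol=0.1):
    """Basic ABC rejection: keep prior draws whose simulated summary
    statistic falls within tol of the observed one (no likelihood needed)."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()            # draw a parameter from the prior
        x = simulate(theta)               # simulate pseudo-data under theta
        if abs(x - observed) < tol:       # likelihood-free comparison
            accepted.append(theta)
    return accepted                       # approximate posterior sample

random.seed(1)
# Toy model (illustrative only): theta ~ Uniform(0,1), summary = theta + noise.
posterior = abc_rejection(
    observed=0.5,
    prior_sample=random.random,
    simulate=lambda t: t + random.gauss(0, 0.05),
)
```

    Shrinking the tolerance trades acceptance rate against the quality of the posterior approximation, which is why more elaborate ABC-MCMC and ABC-SMC variants exist.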

    Bayesian Optimization for Adaptive MCMC

    This paper proposes a new randomized strategy for adaptive MCMC using Bayesian optimization. This approach applies to non-differentiable objective functions and trades off exploration and exploitation to reduce the number of potentially costly objective function evaluations. We demonstrate the strategy in the complex setting of sampling from constrained, discrete and densely connected probabilistic graphical models where, for each variation of the problem, one needs to adjust the parameters of the proposal mechanism automatically to ensure efficient mixing of the Markov chains. Comment: This paper contains 12 pages and 6 figures. A similar version of this paper has been submitted to AISTATS 2012 and is currently under review.
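    The paper's Bayesian-optimization machinery is too involved to sketch here, but the exploration/exploitation tradeoff it relies on can be illustrated with a simpler upper-confidence-bound bandit that picks among candidate proposal scales by observed squared jump distance. This is an invented stand-in for illustration, not the paper's algorithm:

```python
import math
import random

def ucb_scale_selection(logpi, scales, iters=3000, seed=0):
    """Treat candidate proposal scales as bandit arms; reward an arm with the
    squared jump distance of its accepted moves, and pick arms by a UCB score
    that balances exploring rarely tried scales against exploiting good ones."""
    rng = random.Random(seed)
    counts = [0] * len(scales)
    rewards = [0.0] * len(scales)
    x = 0.0
    for t in range(1, iters + 1):
        def score(i):
            # mean reward plus an exploration bonus that shrinks with counts[i]
            if counts[i] == 0:
                return float("inf")
            return rewards[i] / counts[i] + math.sqrt(2 * math.log(t) / counts[i])
        i = max(range(len(scales)), key=score)
        y = x + rng.gauss(0, scales[i])
        alpha = math.exp(min(0.0, logpi(y) - logpi(x)))
        if rng.random() < alpha:
            rewards[i] += (y - x) ** 2    # squared jump distance when accepted
            x = y
        counts[i] += 1
    return scales[max(range(len(scales)), key=lambda i: rewards[i] / counts[i])]

# Standard normal target: the middle scale should mix best.
best = ucb_scale_selection(lambda v: -0.5 * v * v, scales=[0.05, 2.5, 50.0])
```

    Like the paper's approach, this tunes the proposal from noisy, non-differentiable feedback; the paper replaces the bandit with a Gaussian-process surrogate over continuous parameter settings.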

    Efficient Gaussian Sampling for Solving Large-Scale Inverse Problems using MCMC Methods

    The resolution of many large-scale inverse problems using MCMC methods requires a step of drawing samples from a high-dimensional Gaussian distribution. While direct Gaussian sampling techniques, such as those based on Cholesky factorization, incur excessive numerical complexity and memory requirements, sequential coordinate sampling methods suffer from a low rate of convergence. Based on the reversible jump Markov chain framework, this paper proposes an efficient Gaussian sampling algorithm with reduced computation cost and memory usage. The main feature of the algorithm is to perform an approximate resolution of a linear system, with a truncation level adjusted by a self-tuning adaptive scheme that achieves the minimal computation cost. The connection between this algorithm and some existing strategies is discussed, and its efficiency is illustrated on a linear inverse problem of image resolution enhancement. Comment: 20 pages, 10 figures, under review for journal publication.
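    For context, the direct Cholesky-based sampling that the abstract describes as prohibitive in high dimension can be sketched as follows (generic illustration; the variable names and toy matrix are invented):

```python
import numpy as np

def sample_gaussian_precision(mu, Q, n, rng):
    """Exact sampling from N(mu, Q^{-1}) via Cholesky of the precision matrix:
    with Q = L L^T and z ~ N(0, I), the solution x of L^T x = z has covariance
    (L L^T)^{-1} = Q^{-1}. The O(d^3) factorization and O(d^2) storage are the
    bottleneck that approximate schemes like the paper's aim to avoid."""
    L = np.linalg.cholesky(Q)              # Q = L @ L.T
    z = rng.standard_normal((len(mu), n))
    x = np.linalg.solve(L.T, z)            # cov(x) = Q^{-1} by construction
    return mu[:, None] + x

rng = np.random.default_rng(0)
Q = np.array([[2.0, 0.5], [0.5, 1.0]])    # toy 2x2 precision matrix
mu = np.array([1.0, -1.0])
samples = sample_gaussian_precision(mu, Q, 20000, rng)
```

    In the inverse-problem setting Q is typically large and structured (e.g. H^T H plus a regularization term), which is exactly when the full factorization stops being affordable.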

    Adaptive MC^3 and Gibbs algorithms for Bayesian Model Averaging in Linear Regression Models

    The MC^3 (Madigan and York, 1995) and Gibbs (George and McCulloch, 1997) samplers are the most widely implemented algorithms for Bayesian Model Averaging (BMA) in linear regression models. These samplers draw a variable at random in each iteration using uniform selection probabilities and then propose to update that variable. This may be computationally inefficient if the number of variables is large and many variables are redundant. In this work, we introduce adaptive versions of these samplers that retain their simplicity of implementation and reduce the selection probabilities of the many redundant variables. The improvements in efficiency for the adaptive samplers are illustrated on real and simulated datasets.
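    The adaptive-selection idea can be sketched with a toy Metropolis flip sampler whose per-variable selection weights adapt toward observed flip-acceptance rates, so that redundant variables are proposed less often. This is an invented illustration of the general idea, not the paper's exact scheme (which must also guarantee ergodicity of the adapting chain):

```python
import math
import random

def adaptive_flip_sampler(logpost, p, iters=20000, eps=0.01, seed=0):
    """Metropolis over inclusion indicators, flipping one variable per
    iteration. Each variable's selection weight tracks its smoothed
    flip-acceptance rate plus a floor eps, so variables whose flips are
    always rejected are proposed less and less often."""
    rng = random.Random(seed)
    gamma = [0] * p                         # inclusion indicators
    weights = [1.0] * p                     # adaptive selection weights
    acc = [1.0] * p                         # smoothed accepted-flip counts
    tried = [1.0] * p
    cur = logpost(gamma)
    for _ in range(iters):
        j = rng.choices(range(p), weights=weights)[0]
        gamma[j] ^= 1                       # propose flipping variable j
        new = logpost(gamma)
        if math.log(rng.random() + 1e-300) < new - cur:
            cur = new
            acc[j] += 1.0
        else:
            gamma[j] ^= 1                   # reject: undo the flip
        tried[j] += 1.0
        weights[j] = eps + acc[j] / tried[j]
    return weights

# Toy posterior: variables 0-1 favour inclusion, variables 2-9 are redundant.
effect = [2.0, 2.0] + [-4.0] * 8
weights = adaptive_flip_sampler(
    lambda g: sum(e for e, gi in zip(effect, g) if gi), p=10)
```

    After adaptation the two informative variables carry visibly larger selection weights than the redundant ones, which is the efficiency gain the abstract describes.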

    Efficient computational strategies for doubly intractable problems with applications to Bayesian social networks

    Powerful ideas that have recently appeared in the literature are adapted and combined to design improved samplers for Bayesian exponential random graph models. Different forms of adaptive Metropolis-Hastings proposals (vertical, horizontal and rectangular) are tested and combined with the delayed rejection (DR) strategy, with the aim of reducing the variance of the resulting Markov chain Monte Carlo estimators for a given computational time. In the examples treated in this paper, the best combination, namely horizontal adaptation with delayed rejection, leads to a variance reduction that varies between 92% and 144% relative to the adaptive direction sampling approximate exchange algorithm of Caimo and Friel (2011). These results correspond to an increased performance varying from 10% to 94% once simulation time is taken into account. The largest improvements are obtained when highly correlated posterior distributions are considered. Comment: 23 pages, 8 figures. Accepted to appear in Statistics and Computing.
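    Delayed rejection itself is easy to sketch: when a bold first-stage proposal is rejected, a more cautious second proposal is tried, accepted with the Tierney-Mira stage-2 probability that preserves detailed balance. The one-dimensional target and scales below are invented for illustration:

```python
import math
import random

def _norm_logpdf(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi) - math.log(sigma) \
        - 0.5 * ((x - mu) / sigma) ** 2

def _accept1(logdiff):
    """Stage-1 Metropolis acceptance probability min(1, exp(logdiff))."""
    return 1.0 if logdiff >= 0 else math.exp(logdiff)

def dr_metropolis(logpi, x0, iters, s1=5.0, s2=0.5, seed=0):
    """Random-walk Metropolis with one delayed-rejection stage: a bold
    Gaussian proposal (scale s1), then on rejection a cautious one (scale s2)
    accepted with the Tierney-Mira second-stage probability."""
    rng = random.Random(seed)
    x, out = x0, []
    for _ in range(iters):
        y1 = x + rng.gauss(0, s1)
        a1 = _accept1(logpi(y1) - logpi(x))
        if rng.random() < a1:
            x = y1
        else:
            y2 = x + rng.gauss(0, s2)          # symmetric stage-2 kernel
            a1_rev = _accept1(logpi(y1) - logpi(y2))
            # Stage-2 ratio: pi, the stage-1 density of y1 from each endpoint,
            # and the probability of having rejected y1 from each endpoint.
            num = logpi(y2) + _norm_logpdf(y1, y2, s1) \
                + math.log(max(1.0 - a1_rev, 1e-300))
            den = logpi(x) + _norm_logpdf(y1, x, s1) \
                + math.log(max(1.0 - a1, 1e-300))
            if math.log(rng.random() + 1e-300) < num - den:
                x = y2
        out.append(x)
    return out

chain = dr_metropolis(lambda v: -0.5 * v * v, x0=0.0, iters=20000)
```

    The second stage rescues iterations that a plain Metropolis sampler would waste, which is how DR reduces estimator variance at fixed chain length.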

    Free energy Sequential Monte Carlo, application to mixture modelling

    We introduce a new class of Sequential Monte Carlo (SMC) methods, which we call free energy SMC. This class is inspired by free energy methods, which originate from physics, and in which one samples from a biased distribution such that a given function ξ(θ) of the state θ is forced to be uniformly distributed over a given interval. From an initial sequence of distributions (π_t) of interest, and a particular choice of ξ(θ), a free energy SMC sampler computes sequentially a sequence of biased distributions (π̃_t) with the following properties: (a) the marginal distribution of ξ(θ) with respect to π̃_t is approximately uniform over a specified interval, and (b) π̃_t and π_t have the same conditional distribution with respect to ξ. We apply our methodology to mixture posterior distributions, which are highly multimodal. In the mixture context, forcing certain hyper-parameters to higher values greatly facilitates mode swapping and makes it possible to recover a symmetric output. We illustrate our approach with univariate and bivariate Gaussian mixtures and two real-world datasets. Comment: presented at "Bayesian Statistics 9" (Valencia meetings, 4-8 June 2010, Benidorm).
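    The biasing idea, flattening the marginal of ξ(θ) over an interval while leaving the conditional given ξ untouched, can be illustrated statically with histogram-based importance weights. This invented sketch shows only the reweighting step; the paper constructs such biases sequentially inside an SMC sampler:

```python
import random

def flatten_weights(samples, xi, bins, lo, hi):
    """Importance weights that bias a sample from pi so that the marginal of
    xi becomes (approximately) uniform on [lo, hi]: each draw is weighted by
    the reciprocal of the histogram-estimated xi-density at its value.
    Assumes every bin receives at least one draw."""
    width = (hi - lo) / bins
    bin_of = [min(bins - 1, max(0, int((xi(s) - lo) / width)))
              for s in samples]
    counts = [0] * bins
    for b in bin_of:
        counts[b] += 1
    w = [1.0 / counts[b] for b in bin_of]
    total = sum(w)
    return [wi / total for wi in w], bin_of

random.seed(0)
draws = [random.gauss(0, 1) for _ in range(50000)]
weights, bin_of = flatten_weights(draws, lambda s: s, bins=8, lo=-2.0, hi=2.0)

# Weighted mass per bin: flat, even though the raw draws are bell-shaped.
mass = [0.0] * 8
for wi, b in zip(weights, bin_of):
    mass[b] += wi
```

    Within each bin the relative weights of the draws are unchanged, which is the finite-sample analogue of property (b) above: the conditional distribution given ξ is preserved.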

    Grapham: Graphical Models with Adaptive Random Walk Metropolis Algorithms

    Recently developed adaptive Markov chain Monte Carlo (MCMC) methods have been applied successfully to many problems in Bayesian statistics. Grapham is a new open source implementation covering several such methods, with emphasis on graphical models for directed acyclic graphs. The implemented algorithms include the seminal Adaptive Metropolis algorithm, which adjusts the proposal covariance according to the history of the chain, and a Metropolis algorithm that adjusts the proposal scale based on the observed acceptance probability. Different variants of the algorithms allow one, for example, to use these two algorithms together, employ delayed rejection and adjust several parameters of the algorithms. The implemented Metropolis-within-Gibbs update allows arbitrary sampling blocks. The software is written in C and uses the simple extension language Lua for configuration. Comment: 9 pages, 3 figures; added references, revised language, other minor changes.
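    One of the implemented schemes, scale adaptation driven by the observed acceptance probability, can be sketched as a Robbins-Monro recursion. This is a generic one-dimensional sketch of the technique, not Grapham's actual code (Grapham itself is written in C and configured in Lua); the target and tuning constants are invented:

```python
import math
import random

def adaptive_scale_metropolis(logpi, x0, iters, target=0.44, seed=0):
    """Random-walk Metropolis whose log proposal scale follows a
    Robbins-Monro recursion toward a target acceptance rate. The 1/sqrt(t)
    step sizes make the adaptation diminish over time, which is the standard
    condition for such samplers to remain valid."""
    rng = random.Random(seed)
    x, log_s, out = x0, 0.0, []
    for t in range(1, iters + 1):
        y = x + rng.gauss(0, math.exp(log_s))
        diff = logpi(y) - logpi(x)
        alpha = 1.0 if diff >= 0 else math.exp(diff)
        if rng.random() < alpha:
            x = y
        # Nudge the scale up when accepting too often, down when too rarely.
        log_s += (alpha - target) / math.sqrt(t)
        out.append(x)
    return out, math.exp(log_s)

chain, scale = adaptive_scale_metropolis(lambda v: -0.5 * v * v,
                                         x0=0.0, iters=20000)
```

    For a one-dimensional standard normal target the recursion settles near the well-known optimal random-walk scale of roughly 2.4, with the target acceptance rate 0.44 usual for single-coordinate updates.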