
    Bayesian computation via empirical likelihood

    Approximate Bayesian computation (ABC) has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings, one that bypasses simulations from the model and the choice of the ABC parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The BCel algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models.
    Comment: 21 pages, 12 figures, revised version of the previous version with a new title
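
    A minimal sketch of the importance-sampling idea behind BCel, assuming a toy setting where the parameter of interest is a population mean with estimating equation g(x, theta) = x - theta; the prior, the Newton solver, and all function names are illustrative choices, not the authors' implementation.

```python
import numpy as np

def log_empirical_likelihood(theta, x, iters=50):
    """Log empirical likelihood ratio for a mean parameter, via the Lagrange multiplier."""
    z = x - theta
    if z.min() >= 0 or z.max() <= 0:
        return -np.inf                        # theta outside the convex hull of the data: EL is zero
    lo, hi = -1.0 / z.max() + 1e-10, -1.0 / z.min() - 1e-10
    lam = 0.0
    for _ in range(iters):                    # Newton iterations for sum z_i / (1 + lam * z_i) = 0
        w = 1.0 + lam * z
        grad = np.sum(z / w)
        hess = -np.sum((z / w) ** 2)
        lam = np.clip(lam - grad / hess, lo, hi)
    return -np.sum(np.log1p(lam * z))

def bc_el(x, n_draws=10_000):
    """Importance sampling from the prior with empirical-likelihood weights (the BCel idea)."""
    thetas = np.random.default_rng(0).normal(0.0, 10.0, size=n_draws)  # draws from an assumed N(0, 10^2) prior
    logw = np.array([log_empirical_likelihood(t, x) for t in thetas])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    ess = 1.0 / np.sum(w ** 2)                # effective sample size used to assess performance
    return thetas, w, ess

data = np.random.default_rng(1).normal(2.0, 1.0, size=100)
thetas, w, ess = bc_el(data)
print("posterior mean ~", np.sum(w * thetas), "ESS =", ess)
```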

    On computational tools for Bayesian data analysis

    While Robert and Rousseau (2010) addressed the foundational aspects of Bayesian analysis, the current chapter details its practical aspects through a review of the computational methods available for approximating Bayesian procedures. Recent innovations such as Markov chain Monte Carlo, sequential Monte Carlo methods and, more recently, Approximate Bayesian Computation techniques have considerably increased the potential for Bayesian applications, and they have also opened new avenues for Bayesian inference, first and foremost Bayesian model choice.
    Comment: This is a chapter for the book "Bayesian Methods and Expert Elicitation" edited by Klaus Bocker, 23 pages, 9 figures
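
    Not an example taken from the chapter; just a minimal random-walk Metropolis sketch of the kind of MCMC tool it surveys, with an assumed Gaussian target and an arbitrary proposal scale.

```python
import numpy as np

def metropolis(log_post, x0, n_iter=20_000, step=0.5, rng=None):
    """Random-walk Metropolis sampler for a one-dimensional log-posterior."""
    rng = rng or np.random.default_rng()
    x, lp = x0, log_post(x0)
    chain = np.empty(n_iter)
    for t in range(n_iter):
        prop = x + step * rng.normal()             # symmetric random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis acceptance step
            x, lp = prop, lp_prop
        chain[t] = x
    return chain

# Toy posterior: N(1, 0.5^2), up to an additive constant
chain = metropolis(lambda x: -0.5 * ((x - 1.0) / 0.5) ** 2, x0=0.0)
print(chain[5_000:].mean(), chain[5_000:].std())
```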

    Approximate Bayesian Computation by Subset Simulation

    A new Approximate Bayesian Computation (ABC) algorithm for Bayesian updating of model parameters is proposed in this paper. It combines the ABC principles with the technique of Subset Simulation for efficient rare-event simulation, first developed by S.K. Au and J.L. Beck [1], and has been named ABC-SubSim. The idea is to choose the nested decreasing sequence of regions in Subset Simulation as the regions that correspond to increasingly closer approximations of the actual data vector in observation space. The efficiency of the algorithm is demonstrated in two examples that illustrate some of the challenges faced in real-world applications of ABC. We show that the proposed algorithm outperforms other recent sequential ABC algorithms in terms of computational efficiency while achieving the same, or better, accuracy in the posterior distribution. We also show that ABC-SubSim readily provides an estimate of the evidence (marginal likelihood) for posterior model class assessment, as a by-product.
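
    A compact sketch of the ABC-SubSim idea: Subset-Simulation levels defined by progressively smaller ABC tolerances, with MCMC moves restricted to each level. The Gaussian toy model, the proposal scale, and the level parameters below are illustrative assumptions, not the paper's settings or the authors' modified Metropolis implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
y_obs = rng.normal(3.0, 1.0, size=50)                      # synthetic "observed" data (assumed)

simulate = lambda theta: rng.normal(theta, 1.0, size=50)   # forward model (assumed)
rho = lambda y: abs(y.mean() - y_obs.mean())               # distance on a summary statistic
log_prior = lambda t: -0.5 * (t / 10.0) ** 2               # N(0, 10^2) prior, up to a constant

def abc_subsim(n=1000, p0=0.2, n_levels=5, step=0.5):
    # Level 0: plain ABC rejection sampling from the prior
    thetas = rng.normal(0.0, 10.0, size=n)
    dists = np.array([rho(simulate(t)) for t in thetas])
    for _ in range(n_levels):
        eps = np.quantile(dists, p0)                       # adaptive intermediate tolerance
        order = np.argsort(dists)[: int(n * p0)]           # seeds: the p0 fraction closest to the data
        seeds, seed_d = thetas[order], dists[order]
        thetas, dists = [], []
        for t, d in zip(seeds, seed_d):                    # MCMC moves inside the region {rho <= eps}
            for _ in range(int(1 / p0)):
                prop = t + step * rng.normal()
                d_prop = rho(simulate(prop))
                accept = np.log(rng.uniform()) < log_prior(prop) - log_prior(t)
                if accept and d_prop <= eps:
                    t, d = prop, d_prop
                thetas.append(t)
                dists.append(d)
        thetas, dists = np.array(thetas), np.array(dists)
    return thetas, eps

samples, eps_final = abc_subsim()
print("final tolerance:", eps_final, " posterior mean ~", samples.mean())
```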

    Inverse Problems in a Bayesian Setting

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ), i.e. the propagation of uncertainty through a computational (forward) model, are strongly connected. In the form of a conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional expectation, its various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in the form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. Finally, we compare the linear and nonlinear Bayesian update in the form of a filter on some examples.
    Comment: arXiv admin note: substantial text overlap with arXiv:1312.504
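
    As a concrete illustration of the linear Bayesian update viewed as a filter, here is a sample-based (ensemble) sketch; the paper itself works with functional or spectral (polynomial chaos) representations rather than Monte Carlo samples, so the scalar forward model, noise level, and ensemble form below are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

G = lambda q: q ** 3 + q                # a simple nonlinear forward model (assumed)
sigma_obs = 0.3                         # observation noise standard deviation (assumed)
q_true = 0.8
y_obs = G(q_true) + sigma_obs * rng.normal()

# Prior ensemble of the parameter q and of the predicted measurement y = G(q) + noise
n = 5000
q_f = rng.normal(0.0, 1.0, size=n)
y_f = G(q_f) + sigma_obs * rng.normal(size=n)

# Linear conditional-expectation (Kalman-type) update: q_a = q_f + K * (y_obs - y_f),
# with gain K = cov(q, y) / var(y) estimated from the ensemble.
C = np.cov(q_f, y_f)
K = C[0, 1] / C[1, 1]
q_a = q_f + K * (y_obs - y_f)

print("prior mean:", q_f.mean(), " updated mean:", q_a.mean(), " true value:", q_true)
```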

    Approximate Bayesian computation scheme for parameter inference and model selection in dynamical systems

    Approximate Bayesian computation methods can be used to evaluate posterior distributions without having to calculate likelihoods. In this paper we discuss and apply an approximate Bayesian computation (ABC) method based on sequential Monte Carlo (SMC) to estimate parameters of dynamical models. We show that ABC SMC gives information about the inferability of parameters and model sensitivity to changes in parameters, and tends to perform better than other ABC approaches. The algorithm is applied to several well-known biological systems, for which parameters and their credible intervals are inferred. Moreover, we develop ABC SMC as a tool for model selection; given a range of different mathematical descriptions, ABC SMC is able to choose the best model using the standard Bayesian model selection apparatus.
    Comment: 26 pages, 9 figures
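
    A minimal sketch of the ABC SMC scheme (sequential populations with shrinking tolerances, perturbation of resampled particles, and importance weights), for a one-parameter toy model; the Gaussian model, perturbation kernel, and tolerance schedule are illustrative assumptions rather than the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(3)
y_obs = rng.normal(2.0, 1.0, size=30)                     # synthetic "observed" data (assumed)

simulate = lambda t: rng.normal(t, 1.0, size=30)          # forward model (assumed)
rho = lambda y: abs(y.mean() - y_obs.mean())              # distance on a summary statistic
prior_pdf = lambda t: np.exp(-0.5 * (t / 5.0) ** 2)       # N(0, 5^2) prior, unnormalised
prior_draw = lambda: 5.0 * rng.normal()

def abc_smc(epsilons=(2.0, 1.0, 0.5, 0.2), n=500, tau=0.5):
    thetas, weights = None, None
    for t, eps in enumerate(epsilons):                    # one population per tolerance
        new_thetas, new_w = [], []
        while len(new_thetas) < n:
            if t == 0:
                prop = prior_draw()                       # first population: draw from the prior
            else:
                base = rng.choice(thetas, p=weights)      # resample from the previous population
                prop = base + tau * rng.normal()          # Gaussian perturbation kernel
            if rho(simulate(prop)) <= eps:                # ABC acceptance
                if t == 0:
                    w = 1.0
                else:                                     # importance weight: prior over kernel mixture
                    kern = np.exp(-0.5 * ((prop - thetas) / tau) ** 2)
                    w = prior_pdf(prop) / np.sum(weights * kern)
                new_thetas.append(prop)
                new_w.append(w)
        thetas = np.array(new_thetas)
        weights = np.array(new_w)
        weights /= weights.sum()
    return thetas, weights

thetas, weights = abc_smc()
print("ABC SMC posterior mean ~", np.sum(weights * thetas))
```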

    Regression approaches for Approximate Bayesian Computation

    This book chapter introduces regression approaches and regression adjustment for Approximate Bayesian Computation (ABC). Regression adjustment adjusts parameter values after rejection sampling in order to account for the imperfect match between simulations and observations. This mismatch can be more pronounced when there are many summary statistics, a phenomenon known as the curse of dimensionality. Because of this imperfect match, credibility intervals obtained with rejection approaches can be inflated compared to true credibility intervals. The chapter presents the main concepts underlying regression adjustment, together with a theorem that compares the theoretical properties of posterior distributions obtained with and without regression adjustment. Finally, a practical application of regression adjustment in population genetics shows that regression adjustment shrinks posterior distributions compared to rejection approaches, which avoids inflated credibility intervals.
    Comment: Book chapter, published in the Handbook of Approximate Bayesian Computation, 2018
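
    A minimal sketch of the regression-adjustment step the chapter reviews, in the spirit of local-linear adjustment after rejection sampling; the Gaussian toy model, the two summary statistics, and the plain (unweighted) least-squares fit are simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
y_obs = rng.normal(1.5, 1.0, size=40)
s_obs = np.array([y_obs.mean(), y_obs.std()])             # two summary statistics (assumed)

def summaries(theta):
    """Simulate data under the toy model and return its summary statistics."""
    y = rng.normal(theta, 1.0, size=40)
    return np.array([y.mean(), y.std()])

# Rejection step: keep the prior draws whose summaries fall closest to s_obs
n = 20_000
thetas = rng.normal(0.0, 5.0, size=n)
S = np.array([summaries(t) for t in thetas])
dist = np.linalg.norm(S - s_obs, axis=1)
keep = dist <= np.quantile(dist, 0.01)
theta_acc, S_acc = thetas[keep], S[keep]

# Regression adjustment: fit theta ~ alpha + beta^T (s - s_obs) on the accepted
# draws, then shift each accepted draw to what it would be at s = s_obs.
X = np.column_stack([np.ones(S_acc.shape[0]), S_acc - s_obs])
coef, *_ = np.linalg.lstsq(X, theta_acc, rcond=None)
theta_adj = theta_acc - (S_acc - s_obs) @ coef[1:]

print("rejection mean:", theta_acc.mean(), " adjusted mean:", theta_adj.mean())
```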