
    Accelerating inference for stochastic kinetic models

    Stochastic kinetic models (SKMs) are increasingly used to account for the inherent stochasticity exhibited by interacting populations of species in areas such as epidemiology, population ecology and systems biology. Species numbers are modelled using a continuous-time stochastic process which, depending on the application area of interest, will typically take the form of a Markov jump process or an Itô diffusion process. Widespread use of these models is typically precluded by their computational complexity. In particular, performing exact fully Bayesian inference in either modelling framework is challenging due to the intractability of the observed data likelihood, necessitating the use of computationally intensive techniques such as particle Markov chain Monte Carlo (particle MCMC). It is proposed to increase the computational and statistical efficiency of this approach by leveraging the tractability of an inexpensive surrogate derived directly from either the jump or diffusion process. The surrogate is used in three ways: in the design of a gradient-based parameter proposal, in the construction of an appropriate bridge, and in the first stage of a delayed-acceptance step. The resulting approach, which exactly targets the posterior of interest, offers substantial gains in efficiency over a standard particle MCMC implementation. (Comment: 29 pages.)
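
    To illustrate how a cheap surrogate can steer the proposal, the sketch below uses the surrogate's gradient in a Langevin-style (MALA) proposal and corrects with a particle-filter likelihood estimate in a pseudo-marginal acceptance step. It is a minimal illustration under assumed user-supplied callables (surrogate_logpost_and_grad, particle_filter_loglik, log_prior), not the paper's full scheme, which additionally uses the surrogate to build a bridge and a delayed-acceptance stage.

        import numpy as np

        def surrogate_mala_pmmh(theta0, n_iters, step,
                                surrogate_logpost_and_grad, particle_filter_loglik,
                                log_prior, rng=np.random.default_rng()):
            """Pseudo-marginal MH with a surrogate-gradient (MALA-style) proposal.
            theta is assumed to be a 1-D numpy array.
            surrogate_logpost_and_grad(theta) -> (surrogate log-posterior, its gradient);
            particle_filter_loglik(theta)     -> log of an unbiased likelihood estimate."""
            theta = np.asarray(theta0, dtype=float)
            ll = particle_filter_loglik(theta)
            chain = [theta.copy()]
            for _ in range(n_iters):
                _, grad = surrogate_logpost_and_grad(theta)
                mean_fwd = theta + 0.5 * step**2 * grad          # surrogate-informed drift
                prop = mean_fwd + step * rng.standard_normal(theta.size)
                _, grad_p = surrogate_logpost_and_grad(prop)
                mean_bwd = prop + 0.5 * step**2 * grad_p
                # Gaussian proposal log-densities for the MH correction (constants cancel)
                log_q_fwd = -0.5 * np.sum((prop - mean_fwd)**2) / step**2
                log_q_bwd = -0.5 * np.sum((theta - mean_bwd)**2) / step**2
                ll_prop = particle_filter_loglik(prop)           # expensive particle filter
                log_alpha = (ll_prop + log_prior(prop) + log_q_bwd
                             - ll - log_prior(theta) - log_q_fwd)
                if np.log(rng.uniform()) < log_alpha:
                    theta, ll = prop, ll_prop
                chain.append(theta.copy())
            return np.array(chain)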

    Accelerating delayed-acceptance Markov chain Monte Carlo algorithms

    Delayed-acceptance Markov chain Monte Carlo (DA-MCMC) samples from a probability distribution via a two-stage version of the Metropolis-Hastings algorithm, combining the target distribution with a "surrogate", i.e. an approximate and computationally cheaper version of that distribution. DA-MCMC accelerates MCMC sampling in complex applications, while still targeting the exact distribution. We design a computationally faster, albeit approximate, DA-MCMC algorithm. We consider parameter inference in a Bayesian setting where a surrogate likelihood function is introduced in the delayed-acceptance scheme. When the evaluation of the likelihood function is computationally intensive, our scheme produces a 2-4 times speed-up compared to standard DA-MCMC, although the acceleration is highly problem dependent. Inference results for the standard delayed-acceptance algorithm and our approximated version are similar, indicating that our algorithm can return reliable Bayesian inference. As a computationally intensive case study, we introduce a novel stochastic differential equation model for protein folding data. (Comment: 40 pages, 21 figures, 10 tables.)
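
    The two-stage accept/reject logic at the heart of delayed acceptance is short enough to sketch. In the snippet below, a minimal sketch assuming user-supplied log_surrogate, log_target and a symmetric proposal propose, stage one screens the draw with the cheap surrogate and only survivors pay for an evaluation of the expensive target; the second ratio restores exactness with respect to the target.

        import numpy as np

        def delayed_acceptance_mh(theta0, n_iters, propose, log_surrogate, log_target,
                                  rng=np.random.default_rng()):
            """Two-stage (delayed-acceptance) Metropolis-Hastings with a symmetric proposal.
            Stage 1 uses the cheap surrogate; stage 2 corrects with the expensive target."""
            theta = np.asarray(theta0, dtype=float)
            ls, lt = log_surrogate(theta), log_target(theta)
            chain = [theta.copy()]
            for _ in range(n_iters):
                prop = propose(theta, rng)
                ls_prop = log_surrogate(prop)
                # Stage 1: screen with the surrogate only.
                if np.log(rng.uniform()) < ls_prop - ls:
                    lt_prop = log_target(prop)        # expensive evaluation, only for survivors
                    # Stage 2: correct so that the exact target is preserved.
                    if np.log(rng.uniform()) < (lt_prop - lt) - (ls_prop - ls):
                        theta, ls, lt = prop, ls_prop, lt_prop
                chain.append(theta.copy())
            return np.array(chain)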

    Scalable Inference for Markov Processes with Intractable Likelihoods

    Bayesian inference for Markov processes has become increasingly relevant in recent years. Problems of this type often have intractable likelihoods, and prior knowledge about model rate parameters is often poor. Markov chain Monte Carlo (MCMC) techniques can lead to exact inference in such models, but in practice can suffer performance issues including long burn-in periods and poor mixing. On the other hand, approximate Bayesian computation techniques allow rapid exploration of a large parameter space but yield only approximate posterior distributions. Here we consider the combined use of approximate Bayesian computation (ABC) and MCMC techniques on parallel hardware for improved computational efficiency while retaining exact inference.
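
    One way such a combination can be realised, sketched below purely as an illustration (the simulator simulate, distance dist, prior sampler and exact kernel mcmc_chain are placeholders, and the paper's precise construction may differ), is to use a cheap ABC rejection pass to locate plausible parameter values and calibrate a proposal covariance, then launch independent exact MCMC chains from those values in parallel.

        import numpy as np
        from multiprocessing import Pool

        def abc_pilot(prior_sample, simulate, dist, data, n_sims, eps):
            """ABC rejection pass: keep prior draws whose simulated data lie within eps."""
            kept = []
            for _ in range(n_sims):
                theta = prior_sample()
                if dist(simulate(theta), data) < eps:
                    kept.append(theta)
            return np.array(kept)

        def run_exact_chains(kept, mcmc_chain, n_chains=4):
            """Start exact (e.g. particle) MCMC chains in parallel from ABC draws,
            using the ABC sample covariance to tune a random-walk proposal."""
            cov = np.cov(kept.T) if kept.ndim > 1 else np.var(kept)
            starts = kept[np.random.choice(len(kept), n_chains, replace=False)]
            with Pool(n_chains) as pool:
                return pool.starmap(mcmc_chain, [(s, cov) for s in starts])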

    Speeding Up MCMC by Delayed Acceptance and Data Subsampling

    The complexity of the Metropolis-Hastings (MH) algorithm arises from the requirement of a likelihood evaluation for the full data set in each iteration. Payne and Mallick (2015) propose to speed up the algorithm by a delayed-acceptance approach in which the acceptance decision proceeds in two stages. In the first stage, an estimate of the likelihood based on a random subsample determines whether it is likely that the draw will be accepted and, if so, the second stage uses the full data likelihood to decide upon final acceptance. Evaluating the full data likelihood is thus avoided for draws that are unlikely to be accepted. We propose a more precise likelihood estimator which incorporates auxiliary information about the full data likelihood while only operating on a sparse set of the data. We prove that the resulting delayed-acceptance MH is more efficient than that of Payne and Mallick (2015). The caveat of this approach is that the full data set still needs to be evaluated in the second stage. We therefore propose to substitute this evaluation by an estimate, and construct a state-dependent approximation thereof to use in the first stage. This results in an algorithm that (i) can use a smaller subsample of size m by leveraging recent advances in pseudo-marginal MH (PMMH) and (ii) is provably within O(m^{-2}) of the true posterior. (Comment: accepted for publication in the Journal of Computational and Graphical Statistics.)
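
    The estimator at the centre of such a scheme, a subsample-based estimate of the full-data log-likelihood sharpened by auxiliary information about the remaining observations, can be sketched as follows for i.i.d. data. This is a generic illustration, not the paper's exact construction: the control variate here is a first-order Taylor expansion of each log-likelihood contribution around a reference value theta_star, which is one simple way of realising such auxiliary information, and loglik_i and grad_i are assumed user-supplied per-observation functions. The resulting estimate would play the role of the cheap first-stage quantity in a delayed-acceptance sweep like the one sketched above.

        import numpy as np

        def make_subsample_loglik(loglik_i, grad_i, y, theta_star):
            """Difference estimator of the full-data log-likelihood with a first-order
            Taylor control variate around theta_star. loglik_i(y_i, theta) and
            grad_i(y_i, theta) give each observation's log-likelihood and its gradient;
            theta and theta_star are 1-D numpy arrays."""
            n = len(y)
            l_star = np.array([loglik_i(yi, theta_star) for yi in y])   # precomputed once
            g_star = np.array([grad_i(yi, theta_star) for yi in y])
            sum_l_star, sum_g_star = l_star.sum(), g_star.sum(axis=0)

            def estimate(theta, m, rng):
                idx = rng.choice(n, size=m, replace=True)
                d = theta - theta_star
                # control variate q_i(theta) = l_i(theta*) + g_i(theta*)' (theta - theta*)
                q_total = sum_l_star + sum_g_star @ d
                resid = np.array([loglik_i(y[i], theta) - (l_star[i] + g_star[i] @ d)
                                  for i in idx])
                return q_total + n * resid.mean()

            return estimate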

    Approximate Bayesian computation scheme for parameter inference and model selection in dynamical systems

    Approximate Bayesian computation methods can be used to evaluate posterior distributions without having to calculate likelihoods. In this paper we discuss and apply an approximate Bayesian computation (ABC) method based on sequential Monte Carlo (SMC) to estimate parameters of dynamical models. We show that ABC SMC gives information about the inferability of parameters and about model sensitivity to changes in parameters, and tends to perform better than other ABC approaches. The algorithm is applied to several well-known biological systems, for which parameters and their credible intervals are inferred. Moreover, we develop ABC SMC as a tool for model selection; given a range of different mathematical descriptions, ABC SMC is able to choose the best model using the standard Bayesian model selection apparatus. (Comment: 26 pages, 9 figures.)
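
    The ABC SMC recipe itself is compact: particles are filtered through a decreasing sequence of tolerances, each population obtained by perturbing weighted draws from the previous one and reweighting against the prior. The sketch below is a minimal single-model version with a Gaussian perturbation kernel; the simulator simulate, distance dist, prior sampler/density, tolerance schedule and perturbation scale are assumptions, parameters are taken to be 1-D numpy arrays, and model selection would add a model indicator to each particle but follow the same pattern.

        import numpy as np

        def abc_smc(data, simulate, dist, prior_sample, prior_logpdf,
                    eps_schedule, n_particles, perturb_sd, rng=np.random.default_rng()):
            """ABC SMC for a single model with a Gaussian perturbation kernel."""
            thetas, weights = None, None
            for t, eps in enumerate(eps_schedule):
                new_thetas, new_weights = [], []
                while len(new_thetas) < n_particles:
                    if t == 0:
                        cand = prior_sample(rng)
                    else:
                        # resample from the previous population and perturb
                        idx = rng.choice(n_particles, p=weights)
                        cand = thetas[idx] + perturb_sd * rng.standard_normal(thetas[idx].shape)
                    if not np.isfinite(prior_logpdf(cand)):
                        continue                      # zero prior support
                    if dist(simulate(cand, rng), data) < eps:
                        new_thetas.append(cand)
                        if t == 0:
                            new_weights.append(1.0)
                        else:
                            # weight = prior / mixture of perturbation kernels (Toni et al. style)
                            kern = np.exp(-0.5 * np.sum((cand - thetas)**2, axis=1) / perturb_sd**2)
                            new_weights.append(np.exp(prior_logpdf(cand)) / np.sum(weights * kern))
                thetas = np.array(new_thetas)
                weights = np.array(new_weights)
                weights /= weights.sum()
            return thetas, weights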

    Ensemble MCMC: Accelerating Pseudo-Marginal MCMC for State Space Models using the Ensemble Kalman Filter

    Particle Markov chain Monte Carlo (pMCMC) is now a popular method for performing Bayesian statistical inference on challenging state space models (SSMs) with unknown static parameters. It uses a particle filter (PF) at each iteration of an MCMC algorithm to unbiasedly estimate the likelihood for a given static parameter value. However, pMCMC can be computationally intensive when a large number of particles in the PF is required, such as when the data are highly informative, the model is misspecified and/or the time series is long. In this paper we exploit the ensemble Kalman filter (EnKF), developed in the data assimilation literature, to speed up pMCMC. We replace the unbiased PF likelihood with the biased EnKF likelihood estimate within MCMC to sample over the space of the static parameters. On a wide class of non-linear SSMs, we demonstrate that our extended ensemble MCMC (eMCMC) methods can significantly reduce the computational cost whilst maintaining reasonable accuracy. We also propose several extensions of the vanilla eMCMC algorithm to further improve computational efficiency. Computer code to implement our methods on all the examples can be downloaded from https://github.com/cdrovandi/Ensemble-MCMC
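
    The core substitution, an EnKF-based estimate of the log-likelihood dropped into a Metropolis-Hastings sweep over the static parameters, can be sketched for a linear-Gaussian observation model y_t = H x_t + e_t with e_t ~ N(0, R). The transition simulator propagate and initial-state sampler x0_sample are placeholder assumptions, and this is a generic stochastic EnKF sketch rather than the code in the linked repository.

        import numpy as np

        def enkf_loglik(theta, ys, propagate, x0_sample, H, R, n_ens, rng):
            """Estimate log p(y_{1:T} | theta) with a stochastic ensemble Kalman filter.
            ys: (T, d_y) observations; propagate(x, theta, rng): one-step state transition;
            x0_sample(theta, rng): draw from the initial state distribution."""
            ens = np.array([x0_sample(theta, rng) for _ in range(n_ens)])   # (N, d_x)
            ll = 0.0
            for y in ys:
                ens = np.array([propagate(x, theta, rng) for x in ens])     # forecast step
                pred = ens @ H.T                                            # predicted observations
                mu = pred.mean(axis=0)
                S = np.cov(pred, rowvar=False) + R                          # innovation covariance
                ll += -0.5 * (np.log(np.linalg.det(2 * np.pi * S))
                              + (y - mu) @ np.linalg.solve(S, y - mu))      # Gaussian predictive term
                # stochastic EnKF update: perturb observations and shift the ensemble
                C_xy = (ens - ens.mean(axis=0)).T @ (pred - mu) / (n_ens - 1)
                K = C_xy @ np.linalg.inv(S)
                y_pert = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens)
                ens = ens + (y_pert - pred) @ K.T
            return ll

    This (biased) estimate would then simply replace the particle-filter estimate in the acceptance ratio of a standard random-walk MH sweep over the static parameters.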