Simulated convergence rates with application to an intractable α-stable inference problem
© 2017 IEEE. We report the results of a series of numerical studies examining the convergence rate for some approximate representations of α-stable distributions, which are a highly intractable class of distributions for inference purposes. Our proposed representation turns the intractable inference for an infinite-dimensional series of parameters into an (approximately) conditionally Gaussian representation, to which standard inference procedures such as Expectation-Maximization (EM), Markov chain Monte Carlo (MCMC) and Particle Filtering can be readily applied. While we have previously proved the asymptotic convergence of this representation, here we study the rate of this convergence for finite values of a truncation parameter, c. This allows the selection of appropriate truncations for different parameter configurations and for the accuracy required of the model. The convergence is examined directly in terms of cumulative distribution functions and densities, through the application of Berry's theorems and Parseval's theorem. Our results indicate that the behaviour of our representations is significantly superior to that of representations that simply truncate the series with no Gaussian residual term.
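To make the idea of a truncated series with a Gaussian residual concrete, here is a minimal Python sketch of drawing from a symmetric α-stable-like Poisson series truncated at level c, with the discarded tail replaced by a conditionally Gaussian term. The tail-variance formula (the expectation of the tail sum's conditional variance) and the omission of normalising constants are our own illustrative simplifications, not the authors' code:

```python
import math
import random

def stable_series(alpha, c, rng):
    """One draw from a symmetric alpha-stable-like Poisson series,
    truncated at level c, with a Gaussian term standing in for the
    discarded tail (a sketch of the 'conditionally Gaussian' idea;
    normalising constants are omitted, so the scale is not calibrated
    to a standard stable parameterisation)."""
    total, gamma = 0.0, 0.0
    while True:
        gamma += rng.expovariate(1.0)  # Poisson-process arrival times
        if gamma > c:
            break
        total += gamma ** (-1.0 / alpha) * rng.gauss(0.0, 1.0)
    # Tail terms with Gamma_i > c are approximated by a Gaussian whose
    # variance is integral_c^inf t^(-2/alpha) dt = alpha/(2-alpha) * c^(1-2/alpha).
    tail_var = alpha / (2.0 - alpha) * c ** (1.0 - 2.0 / alpha)
    return total + rng.gauss(0.0, math.sqrt(tail_var))

rng = random.Random(0)
draws = [stable_series(alpha=1.5, c=50.0, rng=rng) for _ in range(2000)]
```

Dropping the final Gaussian term gives the plain-truncation representation that the abstract reports as markedly inferior.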
Bayesian inference for stochastic differential equation mixed effects models of a tumor xenography study
We consider Bayesian inference for stochastic differential equation mixed
effects models (SDEMEMs) exemplifying tumor response to treatment and regrowth
in mice. We produce an extensive study on how a SDEMEM can be fitted using both
exact inference based on pseudo-marginal MCMC and approximate inference via
Bayesian synthetic likelihoods (BSL). We investigate a two-compartment SDEMEM,
the compartments corresponding to the fractions of tumor cells killed by the
treatment and surviving it, respectively. The case study data come from a tumor xenography study
with two treatment groups and one control, each containing 5-8 mice. Results
from the case study and from simulations indicate that the SDEMEM is able to
reproduce the observed growth patterns and that BSL is a robust tool for
inference in SDEMEMs. Finally, we compare the fit of the SDEMEM to a similar
ordinary differential equation model. Due to small sample sizes, strong prior
information is needed to identify all model parameters in the SDEMEM and it
cannot be determined which of the two models is better at
predicting tumor growth curves. In a simulation study we find that with a
sample of 17 mice per group BSL is able to identify all model parameters and
distinguish treatment groups. Comment: Minor revision: posterior predictive checks for BSL have been
updated (both theory and results). Code on GitHub has been revised accordingly.
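The synthetic-likelihood step at the heart of BSL can be sketched in a few lines: simulate summary statistics at a candidate parameter, fit a Gaussian to them, and evaluate the observed summary under that Gaussian. The toy model below (a normal mean with a one-dimensional sample-mean summary) is purely illustrative and is not the authors' SDEMEM code:

```python
import math
import random

def synthetic_loglik(theta, observed_summary, simulate_summary, n_sims, rng):
    """Bayesian synthetic likelihood: simulate summary statistics at theta,
    fit a Gaussian to them, and evaluate the observed summary under it.
    One-dimensional summary kept for simplicity."""
    sims = [simulate_summary(theta, rng) for _ in range(n_sims)]
    mu = sum(sims) / n_sims
    var = sum((s - mu) ** 2 for s in sims) / (n_sims - 1)
    return (-0.5 * math.log(2 * math.pi * var)
            - 0.5 * (observed_summary - mu) ** 2 / var)

# Toy model: the summary is the sample mean of 30 draws from N(theta, 1).
def sim_summary(theta, rng, n=30):
    return sum(rng.gauss(theta, 1.0) for _ in range(n)) / n

rng = random.Random(0)
# The synthetic log-likelihood should prefer theta near the observed summary.
ll_good = synthetic_loglik(1.0, 1.0, sim_summary, 200, rng)
ll_bad = synthetic_loglik(3.0, 1.0, sim_summary, 200, rng)
```

In a full BSL run this log-likelihood would be plugged into an MCMC acceptance ratio in place of the intractable exact likelihood.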
Metropolis Sampling
Monte Carlo (MC) sampling methods are widely applied in Bayesian inference,
system simulation and optimization problems. The Markov Chain Monte Carlo
(MCMC) algorithms are a well-known class of MC methods which generate a Markov
chain with the desired invariant distribution. In this document, we focus on
the Metropolis-Hastings (MH) sampler, which can be considered as the atom of
the MCMC techniques, introducing the basic notions and different properties. We
describe in detail all the elements involved in the MH algorithm and the most
relevant variants. Several improvements and recent extensions proposed in the
literature are also briefly discussed, providing a quick but exhaustive
overview of the current world of Metropolis-based sampling. Comment: Wiley StatsRef-Statistics Reference Online, 201
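As a concrete illustration of the MH mechanics described above, here is a minimal random-walk Metropolis sampler in Python; the target (a standard normal, up to a constant), step size, and iteration count are illustrative choices, not from the text:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_iters, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: propose x' = x + step * N(0, 1)
    and accept with probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_iters):
        proposal = x + step * rng.gauss(0.0, 1.0)
        # The proposal is symmetric, so the Hastings correction cancels
        # and the log acceptance ratio is just the log target ratio.
        log_alpha = log_target(proposal) - log_target(x)
        if math.log(rng.random()) < log_alpha:
            x = proposal
        samples.append(x)
    return samples

# Example: target the standard normal density (up to a constant).
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_iters=50000)
```

The chain's samples should have mean near 0 and variance near 1, matching the invariant distribution.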
Noisy Monte Carlo: Convergence of Markov chains with approximate transition kernels
Monte Carlo algorithms often aim to draw from a distribution π by
simulating a Markov chain with transition kernel P such that π is
invariant under P. However, there are many situations for which it is
impractical or impossible to draw from the transition kernel P. For instance,
this is the case with massive datasets, where it is prohibitively expensive to
calculate the likelihood, and is also the case for intractable likelihood models
arising from, for example, Gibbs random fields, such as those found in spatial
statistics and network analysis. A natural approach in these cases is to
replace P by an approximation P̂. Using theory from the stability of
Markov chains we explore a variety of situations where it is possible to
quantify how 'close' the chain given by the approximate transition kernel P̂ is to
the chain given by P. We apply these results to several examples from spatial
statistics and network analysis. Comment: This version: results extended to non-uniformly ergodic Markov chains
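A toy version of this setting can be simulated directly: corrupting the Metropolis log acceptance ratio with noise turns the exact kernel P into an approximate kernel P̂, and the resulting chain can be compared with the exact one. The target, the additive Gaussian noise model, and the scales below are illustrative assumptions, not the paper's examples:

```python
import math
import random

def noisy_mh(log_target, noise_sd, n_iters, step=1.0, seed=0):
    """Metropolis sampler whose acceptance step uses a *noisy* version of
    the log target ratio, i.e. an approximate transition kernel P-hat
    rather than the exact kernel P."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n_iters):
        prop = x + step * rng.gauss(0.0, 1.0)
        # The exact ratio would be log_target(prop) - log_target(x);
        # here it is corrupted by Gaussian noise of scale noise_sd.
        noisy_ratio = log_target(prop) - log_target(x) + rng.gauss(0.0, noise_sd)
        if math.log(rng.random()) < noisy_ratio:
            x = prop
        out.append(x)
    return out

log_target = lambda x: -0.5 * x * x  # standard normal, up to a constant
exact = noisy_mh(log_target, noise_sd=0.0, n_iters=50000)
noisy = noisy_mh(log_target, noise_sd=0.3, n_iters=50000, seed=1)
```

With small noise the two chains behave similarly; the paper's results quantify how the discrepancy grows with the quality of the approximation.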
Approximate Bayesian computation (ABC) gives exact results under the assumption of model error
Approximate Bayesian computation (ABC) or likelihood-free inference
algorithms are used to find approximations to posterior distributions without
making explicit use of the likelihood function, depending instead on simulation
of sample data sets from the model. In this paper we show that under the
assumption of the existence of a uniform additive model error term, ABC
algorithms give exact results when sufficient summaries are used. This
interpretation allows the approximation made in many previous application
papers to be understood, and should guide the choice of metric and tolerance in
future work. ABC algorithms can be generalized by replacing the 0-1 cut-off
with an acceptance probability that varies with the distance of the simulated
data from the observed data. The acceptance density gives the distribution of
the error term, enabling the uniform error usually used to be replaced by a
general distribution. This generalization can also be applied to approximate
Markov chain Monte Carlo algorithms. In light of this work, ABC algorithms can
be seen as calibration techniques for implicit stochastic models, inferring
parameter values in light of the computer model, data, prior beliefs about the
parameter values, and any measurement or model errors. Comment: 33 pages, 1 figure, to appear in Statistical Applications in Genetics
and Molecular Biology 201
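The basic 0-1 cut-off version of ABC described above can be sketched as a rejection sampler; the toy model (a normal mean with a sample-mean summary), the uniform prior, and the tolerance are illustrative assumptions:

```python
import random

def abc_rejection(observed_summary, simulate, prior_sample, tol, n_draws, seed=0):
    """Basic ABC rejection: keep parameter draws whose simulated summary
    falls within `tol` of the observed summary (the 0-1 cut-off)."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        s = simulate(theta, rng)
        if abs(s - observed_summary) < tol:
            accepted.append(theta)
    return accepted

# Toy model: data ~ N(theta, 1), summary = sample mean of 20 points,
# prior theta ~ Uniform(-5, 5); the observed summary is fixed at 1.0.
def simulate(theta, rng, n=20):
    return sum(rng.gauss(theta, 1.0) for _ in range(n)) / n

post = abc_rejection(
    observed_summary=1.0,
    simulate=simulate,
    prior_sample=lambda rng: rng.uniform(-5, 5),
    tol=0.1,
    n_draws=50000,
)
```

The generalisation the abstract describes replaces the hard `abs(s - observed_summary) < tol` test with an acceptance probability that decays with the distance, which corresponds to a non-uniform error distribution.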
Variational Bayes with Intractable Likelihood
Variational Bayes (VB) is rapidly becoming a popular tool for Bayesian
inference in statistical modeling. However, the existing VB algorithms are
restricted to cases where the likelihood is tractable, which precludes the use
of VB in many interesting situations such as in state space models and in
approximate Bayesian computation (ABC), where application of VB methods was
previously impossible. This paper extends the scope of application of VB to
cases where the likelihood is intractable, but can be estimated unbiasedly. The
proposed VB method therefore makes it possible to carry out Bayesian inference
in many statistical applications, including state space models and ABC. The
method is generic in the sense that it can be applied to almost all statistical
models without requiring too much model-based derivation, which is a drawback
of many existing VB algorithms. We also show how the proposed method can be
used to obtain highly accurate VB approximations of marginal posterior
distributions. Comment: 40 pages, 6 figures
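For readers unfamiliar with the tractable-likelihood VB that the paper generalises, here is a minimal reparameterisation-trick VB fit of a Gaussian q(θ) = N(m, s²) to the posterior of a normal mean. The model, prior, and learning rate are illustrative assumptions, and this sketch does not include the paper's unbiased-likelihood-estimate machinery:

```python
import math
import random

def vb_gaussian(ys, n_iters=5000, lr=0.01, seed=0):
    """Reparameterisation-trick VB: fit q(theta) = N(m, s^2) to the
    posterior of a normal mean with known unit variance and N(0, 10)
    prior, by stochastic gradient ascent on the ELBO using
    single-sample gradient estimates."""
    rng = random.Random(seed)
    m, log_s = 0.0, 0.0
    for _ in range(n_iters):
        s = math.exp(log_s)
        eps = rng.gauss(0.0, 1.0)
        theta = m + s * eps  # reparameterised draw from q
        # d/dtheta of [ log p(y | theta) + log prior(theta) ]
        dlogp = sum(y - theta for y in ys) - theta / 10.0
        # Chain rule through theta = m + s * eps; the entropy of q
        # contributes +1 to the log_s gradient.
        grad_m = dlogp
        grad_log_s = dlogp * s * eps + 1.0
        m += lr * grad_m
        log_s += lr * grad_log_s
    return m, math.exp(log_s)

# Fit to 20 synthetic observations centred at 1.0.
rng = random.Random(2)
ys = [rng.gauss(1.0, 1.0) for _ in range(20)]
m, s = vb_gaussian(ys)
```

Because the Gaussian family contains the exact posterior here, m and s should converge close to the analytic posterior mean and standard deviation.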
Sequential Monte Carlo Methods for System Identification
One of the key challenges in identifying nonlinear and possibly non-Gaussian
state space models (SSMs) is the intractability of estimating the system state.
Sequential Monte Carlo (SMC) methods, such as the particle filter (introduced
more than two decades ago), provide numerical solutions to the nonlinear state
estimation problems arising in SSMs. When combined with additional
identification techniques, these algorithms provide solid solutions to the
nonlinear system identification problem. We describe two general strategies for
creating such combinations and discuss why SMC is a natural tool for
implementing these strategies. Comment: In proceedings of the 17th IFAC Symposium on System Identification
(SYSID). Added cover page.
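The bootstrap particle filter, the basic SMC algorithm referenced above, can be sketched as follows for a toy linear-Gaussian state space model; the model, parameter values, and particle count are illustrative assumptions, not from the paper:

```python
import math
import random

def bootstrap_particle_filter(ys, n_particles=500, phi=0.9, q=1.0, r=1.0, seed=0):
    """Bootstrap particle filter for the toy SSM
        x_t = phi * x_{t-1} + N(0, q),    y_t = x_t + N(0, r).
    Returns the filtered state mean at each observation."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in ys:
        # Propagate each particle through the state dynamics.
        particles = [phi * x + rng.gauss(0.0, math.sqrt(q)) for x in particles]
        # Weight by the observation likelihood N(y; x, r).
        weights = [math.exp(-0.5 * (y - x) ** 2 / r) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        means.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling to avoid weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means

# Simulate data from the same model, then filter it.
rng = random.Random(1)
xs, ys, x = [], [], 0.0
for _ in range(200):
    x = 0.9 * x + rng.gauss(0.0, 1.0)
    xs.append(x)
    ys.append(x + rng.gauss(0.0, 1.0))
filtered = bootstrap_particle_filter(ys)
```

The filtered means should track the latent states more closely than the raw observations do, which is the state-estimation capability that the identification strategies in the paper build on.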