
    Information-geometric Markov Chain Monte Carlo methods using Diffusions

    Recent work incorporating geometric ideas in Markov chain Monte Carlo is reviewed in order to highlight these advances and their possible application in a range of domains beyond Statistics. A full exposition of Markov chains and their use in Monte Carlo simulation for statistical inference and molecular dynamics is provided, with particular emphasis on methods based on Langevin diffusions. After this, geometric concepts in Markov chain Monte Carlo are introduced. A full derivation of the Langevin diffusion on a Riemannian manifold is given, together with a discussion of appropriate Riemannian metric choice for different problems. A survey of applications is provided, and some open questions are discussed. Comment: 22 pages, 2 figures
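    The Langevin-based samplers this abstract refers to can be illustrated by a minimal Metropolis-adjusted Langevin algorithm (MALA) on a standard normal target. This is a sketch of the basic (non-manifold) method only, and the target, step size, and chain length are illustrative choices, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_pi(x):
    # Log-density of the standard normal target (up to a constant).
    return -0.5 * x**2

def grad_log_pi(x):
    return -x

def mala(n_steps=20000, step=0.5, x0=0.0):
    """Metropolis-adjusted Langevin algorithm in one dimension."""
    x = x0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        # Euler-Maruyama discretisation of the Langevin diffusion as proposal.
        mean_fwd = x + 0.5 * step * grad_log_pi(x)
        prop = mean_fwd + np.sqrt(step) * rng.standard_normal()
        # Metropolis-Hastings correction: the proposal is not symmetric.
        mean_back = prop + 0.5 * step * grad_log_pi(prop)
        log_q_fwd = -(prop - mean_fwd) ** 2 / (2 * step)
        log_q_back = -(x - mean_back) ** 2 / (2 * step)
        log_alpha = log_pi(prop) - log_pi(x) + log_q_back - log_q_fwd
        if np.log(rng.random()) < log_alpha:
            x = prop
        samples[i] = x
    return samples

draws = mala()
```

    The gradient term drifts proposals toward high-density regions; the geometric methods surveyed in the paper replace the identity preconditioner here with a position-dependent Riemannian metric.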

    Particle Gibbs with Ancestor Sampling

    Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a novel PMCMC algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS). PGAS provides the data analyst with an off-the-shelf class of Markov kernels that can be used to simulate the typically high-dimensional and highly autocorrelated state trajectory in a state-space model. The ancestor sampling procedure enables fast mixing of the PGAS kernel even when using seemingly few particles in the underlying SMC sampler. This is important as it can significantly reduce the computational burden that is typically associated with using SMC. PGAS is conceptually similar to the existing PG with backward simulation (PGBS) procedure. Instead of using separate forward and backward sweeps as in PGBS, however, we achieve the same effect in a single forward sweep. This makes PGAS well suited for addressing inference problems not only in state-space models, but also in models with more complex dependencies, such as non-Markovian, Bayesian nonparametric, and general probabilistic graphical models.
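    The SMC building block that PGAS wraps inside a Markov kernel can be sketched as a bootstrap particle filter. The linear Gaussian state-space model and all parameters below are made up for illustration, and the sketch omits the conditioning and ancestor-sampling steps that define PGAS itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear Gaussian model: x_t = phi*x_{t-1} + v_t, y_t = x_t + w_t.
T, N = 100, 500
phi, q, r = 0.9, 1.0, 1.0

# Simulate a state trajectory and observations from the model.
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(T):
    x_true[t] = phi * (x_true[t - 1] if t else 0.0) + np.sqrt(q) * rng.standard_normal()
    y[t] = x_true[t] + np.sqrt(r) * rng.standard_normal()

# Bootstrap particle filter: propagate, weight, resample.
particles = np.sqrt(q) * rng.standard_normal(N)
filt_mean = np.zeros(T)
for t in range(T):
    if t:
        # Propagate particles through the transition kernel.
        particles = phi * particles + np.sqrt(q) * rng.standard_normal(N)
    # Weight by the observation likelihood, then multinomial resampling.
    logw = -0.5 * (y[t] - particles) ** 2 / r
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filt_mean[t] = np.sum(w * particles)
    particles = particles[rng.choice(N, size=N, p=w)]

rmse = np.sqrt(np.mean((filt_mean - x_true) ** 2))
```

    In PGAS one trajectory of the filter is fixed to a reference path and its ancestor index is resampled at each step, which is what breaks the path degeneracy that would otherwise slow mixing.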

    Fast spatial inference in the homogeneous Ising model

    The Ising model is important in statistical modeling and inference in many applications; however, its normalizing constant, mean number of active vertices, and mean spin interaction are intractable. We provide accurate approximations that make it possible to calculate these quantities numerically. Simulation studies indicate good performance when compared to Markov chain Monte Carlo methods, at a tiny fraction of the time. The methodology is also used to perform Bayesian inference in a functional Magnetic Resonance Imaging activation detection experiment. Comment: 18 pages, 1 figure, 3 tables
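    The MCMC baseline that the paper's approximations are compared against can be sketched as a single-site Gibbs sampler for the homogeneous Ising model. The grid size, inverse temperature, and sweep counts below are illustrative, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Homogeneous Ising model on an n x n torus with spins in {-1, +1}.
n, beta = 16, 0.3
spins = rng.choice([-1, 1], size=(n, n))

def gibbs_sweep(s):
    """One systematic-scan Gibbs sweep over all sites."""
    for i in range(n):
        for j in range(n):
            # Sum of the four neighbours, periodic boundary conditions.
            nb = (s[(i - 1) % n, j] + s[(i + 1) % n, j]
                  + s[i, (j - 1) % n] + s[i, (j + 1) % n])
            # Full conditional: P(s_ij = +1 | rest) = 1/(1 + exp(-2*beta*nb)).
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * nb))
            s[i, j] = 1 if rng.random() < p_up else -1

# Burn-in, then estimate the mean magnetisation by averaging over sweeps.
for _ in range(200):
    gibbs_sweep(spins)
mags = []
for _ in range(200):
    gibbs_sweep(spins)
    mags.append(spins.mean())
m = float(np.mean(mags))
```

    Estimating such expectations by MCMC requires many sweeps per configuration, which is the cost the paper's closed-form approximations avoid.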

    Bayesian Estimation for Parameters of Power Function Distribution under Various Priors

    Although the idea of Bayesian inference dates back to the late 18th century, its use by statisticians was rare until recently. Owing to advances in simulation techniques, however, Bayesian inference and estimation are gaining currency. This paper focuses on Bayesian estimates of the Power Function distribution, using the Weibull and Generalized Gamma distributions as priors for the unknown parameters. Furthermore, the statistical performance of the resulting estimators is compared with the maximum likelihood estimator of the Power Function distribution and with the Bayesian estimator under a Gamma prior on the unknown parameter. The comparison is carried out by Monte Carlo simulation, with MSE as the yardstick.
    Keywords: Squared error loss function, Bayesian estimator, Prior distribution, Monte Carlo simulation
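    The kind of Monte Carlo MSE comparison the abstract describes can be sketched for the Gamma prior case, which is conjugate for the Power Function shape parameter. The true parameter value, sample size, replication count, and hyperparameters below are all illustrative assumptions, and the Weibull and Generalized Gamma priors from the paper are not covered:

```python
import numpy as np

rng = np.random.default_rng(3)

a_true, n, reps = 2.0, 20, 5000
alpha, beta = 2.0, 1.0  # hypothetical Gamma(alpha, beta) prior hyperparameters

mse_mle = mse_bayes = 0.0
for _ in range(reps):
    # Power Function sample on (0, 1): F(x) = x^a, so X = U^(1/a).
    x = rng.random(n) ** (1.0 / a_true)
    s = -np.log(x).sum()
    # MLE of the shape parameter: n / (-sum log x_i).
    a_mle = n / s
    # The Gamma prior is conjugate: posterior is Gamma(alpha + n, beta + s),
    # so the posterior mean under squared error loss is:
    a_bayes = (alpha + n) / (beta + s)
    mse_mle += (a_mle - a_true) ** 2
    mse_bayes += (a_bayes - a_true) ** 2
mse_mle /= reps
mse_bayes /= reps
```

    With the prior mean placed at the true value, the Bayes estimator's shrinkage typically yields a smaller Monte Carlo MSE than the MLE; a misspecified prior can reverse the ranking.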

    Applications of Monte Carlo Methods in Statistical Inference Using Regression Analysis

    This paper studies the use of Monte Carlo simulation techniques in the field of econometrics, specifically statistical inference. First, I examine several estimators by deriving their properties explicitly and generate their distributions through simulations. Here, simulations are used to illustrate and support the analytical results. Then, I look at test statistics whose derivations are costly because of the sensitivity of their critical values to the data generating processes. Here, simulations are necessary for establishing significance and drawing statistical inference. Overall, the paper examines when and how simulations are needed in studying econometric theories.
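    Generating an estimator's distribution by simulation, as described above, can be sketched for the OLS slope in a simple linear regression. The data generating process and simulation settings are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical DGP: y = beta0 + beta1*x + e, with standard normal x and e.
beta0, beta1, n, reps = 1.0, 2.0, 50, 2000

slopes = np.empty(reps)
for r in range(reps):
    x = rng.standard_normal(n)
    y = beta0 + beta1 * x + rng.standard_normal(n)
    # OLS slope: sum (x - xbar)(y - ybar) / sum (x - xbar)^2.
    slopes[r] = np.cov(x, y, bias=True)[0, 1] / x.var()

mean_slope = slopes.mean()
```

    The empirical distribution in `slopes` approximates the estimator's sampling distribution; its mean centres on the true slope, illustrating unbiasedness, and its spread gives a simulated standard error.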

    A Survey of Stochastic Simulation and Optimization Methods in Signal Processing

    Modern signal processing (SP) methods rely very heavily on probability and statistics to solve challenging SP problems. SP methods are now expected to deal with ever more complex models, requiring ever more sophisticated computational inference techniques. This has driven the development of statistical SP methods based on stochastic simulation and optimization. Stochastic simulation and optimization algorithms are computationally intensive tools for performing statistical inference in models that are analytically intractable and beyond the scope of deterministic inference methods. They have recently been successfully applied to many difficult problems involving complex statistical models and sophisticated (often Bayesian) statistical inference techniques. This survey paper offers an introduction to stochastic simulation and optimization methods in signal and image processing. The paper addresses a variety of high-dimensional Markov chain Monte Carlo (MCMC) methods as well as deterministic surrogate methods, such as variational Bayes, the Bethe approach, belief and expectation propagation, and approximate message passing algorithms. It also discusses a range of optimization methods that have been adopted to solve stochastic problems, as well as stochastic methods for deterministic optimization. Subsequently, areas of overlap between simulation and optimization, in particular optimization-within-MCMC and MCMC-driven optimization, are discussed.
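    One of the themes named above, stochastic methods for deterministic optimization, can be sketched with stochastic gradient descent on a fixed finite-sum objective. The objective and step-size schedule are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(5)

# Deterministic objective: f(x) = mean_i (x - a_i)^2 over a fixed data set;
# its minimiser is the sample mean of a.
a = rng.standard_normal(10000) + 3.0

# SGD: one randomly drawn term per step gives an unbiased gradient estimate.
x, lr = 0.0, 0.1
for t in range(1, 5001):
    i = rng.integers(len(a))
    grad = 2.0 * (x - a[i])          # unbiased estimate of f'(x)
    x -= (lr / np.sqrt(t)) * grad    # decaying (Robbins-Monro) step size
```

    Each step is cheap and noisy, but the decaying step size averages the noise out, so the iterate settles near the deterministic minimiser without ever touching the full data set per step.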

    Exact statistical inferences and Monte Carlo method

    © 2014, Pleiades Publishing, Ltd. It is shown that in some situations, for example in models invariant under certain groups of transformations, the search for the constants that determine a statistical inference can be organized by random simulation while maintaining the nominal level of reliability. It is established that the accuracy of the statistical inference varies with the number of Monte Carlo replications M on the order of M^{-1}. Several examples (confidence intervals for the center of the Cauchy distribution, an upper bound for the Laplace scale parameter, discrimination between the normal and Cauchy distributions, and discrimination between the exponential and log-normal distributions) show that an acceptable accuracy of statistical inference is achieved when the number of Monte Carlo replications M > 100.
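    The first example above, a confidence interval for the Cauchy center, can be sketched using the location invariance the abstract appeals to: the distribution of med(X) − θ does not depend on θ, so its quantiles can be found once by simulation. The sample size, replication count M, and true center are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(6)

n, M = 19, 2000
# med(X) - theta is pivotal under the location model, so simulate its
# distribution at theta = 0 to obtain Monte Carlo critical values.
pivot = np.median(rng.standard_cauchy((M, n)), axis=1)
lo, hi = np.quantile(pivot, [0.025, 0.975])

# Check the coverage of the resulting 95% interval [med - hi, med - lo].
theta, trials, hits = 5.0, 2000, 0
for _ in range(trials):
    x = theta + rng.standard_cauchy(n)
    med = np.median(x)
    if med - hi <= theta <= med - lo:
        hits += 1
coverage = hits / trials
```

    Because the pivot's distribution is simulated rather than derived, the interval's coverage error comes only from Monte Carlo noise, which shrinks as M grows, consistent with the M^{-1} accuracy claim.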