
    A spatial analysis of multivariate output from regional climate models

    Climate models have become an important tool in the study of climate and climate change, and ensemble experiments consisting of multiple climate-model runs are used in studying and quantifying the uncertainty in climate-model output. However, there are often only a limited number of model runs available for a particular experiment, and one of the statistical challenges is to characterize the distribution of the model output. To that end, we have developed a multivariate hierarchical approach, at the heart of which is a new representation of a multivariate Markov random field. This approach allows for flexible modeling of the multivariate spatial dependencies, including the cross-dependencies between variables. We demonstrate this statistical model on an ensemble arising from a regional-climate-model experiment over the western United States, and we focus on the projected change in seasonal temperature and precipitation over the next 50 years.
    Comment: Published in at http://dx.doi.org/10.1214/10-AOAS369 the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
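
    The flavour of such a construction can be illustrated with a minimal sketch (not the authors' hierarchical model): a bivariate Gaussian Markov random field on a small lattice, where the cross-dependence between the two variables is carried by a 2x2 cross-precision matrix coupled to a common spatial precision. All names and parameter values below are illustrative assumptions.

        # Minimal sketch, assuming a Kronecker coupling between a 2x2 cross-precision
        # matrix A and a lattice precision Q_spatial; this is one convenient special
        # case of a multivariate GMRF, not the paper's representation.
        import numpy as np

        n = 10                                   # lattice is n x n, so N = n*n sites
        N = n * n

        # First-order neighbour precision (graph Laplacian) plus a small ridge
        Q_spatial = np.zeros((N, N))
        for i in range(n):
            for j in range(n):
                s = i * n + j
                for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
                    ii, jj = i + di, j + dj
                    if 0 <= ii < n and 0 <= jj < n:
                        t = ii * n + jj
                        Q_spatial[s, s] += 1.0
                        Q_spatial[s, t] -= 1.0
        Q_spatial += 0.1 * np.eye(N)             # makes the GMRF proper

        # 2x2 cross-precision: the off-diagonal entry couples the two variables
        A = np.array([[1.0, -0.4],
                      [-0.4, 1.0]])
        Q = np.kron(A, Q_spatial)                # joint precision of (field1, field2)

        # One draw from N(0, Q^{-1}) via the Cholesky factor of Q
        C = np.linalg.cholesky(Q)                # C C^T = Q
        z = np.random.default_rng(0).standard_normal(2 * N)
        x = np.linalg.solve(C.T, z)              # C^T x = z  =>  Cov(x) = Q^{-1}

        field1 = x[:N].reshape(n, n)             # spatially correlated field
        field2 = x[N:].reshape(n, n)             # correlated with field1 through A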

    Does waste-recycling really improve Metropolis-Hastings Monte Carlo algorithm?

    The Metropolis-Hastings algorithm and its multi-proposal extensions are aimed at computing the expectation of a function $f$ under a probability measure $\pi$ that is difficult to simulate directly. They consist in constructing, by an appropriate acceptance/rejection procedure, a Markov chain $(X_k, k\geq 0)$ with transition matrix $P$ such that $\pi$ is reversible with respect to $P$, and in estimating the expectation by the empirical mean $I_n(f)=\frac{1}{n}\sum_{k=1}^n f(X_k)$. The waste-recycling Monte Carlo (WR) algorithm introduced by physicists is a modification of the Metropolis-Hastings algorithm which makes use of all the proposals in the empirical mean, whereas the standard Metropolis-Hastings algorithm only uses the accepted proposals. In this paper, we extend the WR algorithm into a general control-variate technique and exhibit the optimal choice of the control variate in terms of asymptotic variance. We also give an example which shows that, contrary to the intuition of physicists, the WR algorithm can have a larger asymptotic variance than the Metropolis-Hastings algorithm. However, in the particular case of the Metropolis-Hastings algorithm called the Boltzmann algorithm, we prove that the WR algorithm is asymptotically better than the Metropolis-Hastings algorithm.
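
    To make the comparison concrete, here is a minimal sketch of a random-walk Metropolis-Hastings sampler that accumulates both the standard empirical mean and a waste-recycling average, in which every proposal contributes with weight equal to its acceptance probability. The one-dimensional target, proposal scale and test function are illustrative assumptions, not taken from the paper.

        # Minimal sketch: standard MH estimate vs. a waste-recycling average that
        # replaces f(X_{k+1}) by its conditional expectation given the current state
        # and the proposal. Target, step size and f are illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)

        def log_pi(x):                  # unnormalized log-target: standard normal
            return -0.5 * x * x

        def f(x):                       # function whose expectation we estimate
            return x * x                # E[f] = 1 under the standard normal

        n_steps, step = 50_000, 1.5
        x = 0.0
        mh_sum = wr_sum = 0.0

        for _ in range(n_steps):
            y = x + step * rng.standard_normal()          # symmetric proposal
            alpha = min(1.0, np.exp(log_pi(y) - log_pi(x)))
            # waste recycling: both possible next states contribute
            wr_sum += alpha * f(y) + (1.0 - alpha) * f(x)
            if rng.random() < alpha:                      # usual accept/reject
                x = y
            mh_sum += f(x)                                # standard estimator

        print("MH estimate:", mh_sum / n_steps)
        print("WR estimate:", wr_sum / n_steps)

    Whether the waste-recycling average has a smaller asymptotic variance is precisely the question addressed above: not in general, but provably yes for the Boltzmann acceptance rule.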

    Multilevel Markov Chain Monte Carlo Method for High-Contrast Single-Phase Flow Problems

    In this paper we propose a general framework for the uncertainty quantification of quantities of interest for high-contrast single-phase flow problems. It is based on the generalized multiscale finite element method (GMsFEM) and multilevel Monte Carlo (MLMC) methods. The former provides a hierarchy of approximations of different resolution, whereas the latter gives an efficient way to estimate quantities of interest using samples on different levels. The number of basis functions in the online GMsFEM stage can be varied to determine the solution resolution and the computational cost, and to efficiently generate samples at different levels. In particular, it is cheap to generate samples on coarse grids, but with low resolution, and it is expensive to generate samples on fine grids, but with high accuracy. By suitably choosing the number of samples at different levels, one can shift the expensive computation from the larger fine-grid spaces toward the smaller coarse-grid spaces, while retaining the accuracy of the final Monte Carlo estimate. Further, we describe a multilevel Markov chain Monte Carlo method, which sequentially screens the proposal with different levels of approximation and reduces the number of evaluations required on fine grids, while combining the samples at different levels to arrive at an accurate estimate. The framework seamlessly integrates the multiscale features of the GMsFEM with the multilevel features of the MLMC methods, following the work in \cite{ketelson2013}, and our numerical experiments illustrate its efficiency and accuracy in comparison with standard Monte Carlo estimates.
    Comment: 29 pages, 6 figures
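
    The multilevel estimator underlying this framework writes the fine-level expectation as a coarse-level mean plus a telescoping sum of level-to-level corrections, each estimated with its own samples: many on the cheap coarse levels, few on the expensive fine ones. The sketch below illustrates the structure with a toy quantity of interest (a midpoint-rule integral at resolution 2^l) standing in for the GMsFEM solves; the levels, sample counts and quantity of interest are illustrative assumptions only.

        # Minimal MLMC sketch, assuming a toy level hierarchy: level l approximates
        # Q(Z) = \int_0^1 exp(Z x) dx with a midpoint rule on 2^l cells, Z ~ N(0,1).
        import numpy as np

        rng = np.random.default_rng(2)

        def Q_level(z, level):
            """Level-l approximation of the quantity of interest for input z."""
            m = 2 ** level                                # number of midpoint cells
            x = (np.arange(m) + 0.5) / m
            return np.mean(np.exp(z * x))                 # midpoint rule on [0, 1]

        L = 5
        N = [4000, 2000, 1000, 500, 250, 125]             # samples per level, decreasing

        estimate = 0.0
        for level in range(L + 1):
            z = rng.standard_normal(N[level])             # independent samples per level
            if level == 0:
                y = [Q_level(zi, 0) for zi in z]
            else:
                # the same random input drives both resolutions, so the
                # level difference has small variance and needs few samples
                y = [Q_level(zi, level) - Q_level(zi, level - 1) for zi in z]
            estimate += np.mean(y)

        print("MLMC estimate of E[Q]:", estimate)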

    Improving the precision matrix for precision cosmology

    The estimation of cosmological constraints from observations of the large-scale structure of the Universe, such as the power spectrum or the correlation function, requires knowledge of the inverse of the associated covariance matrix, namely the precision matrix $\mathbf{\Psi}$. In most analyses, $\mathbf{\Psi}$ is estimated from a limited set of mock catalogues. Depending on how many mocks are used, this estimation has an associated error which must be propagated into the final cosmological constraints. For future surveys such as Euclid and DESI, the control of this additional uncertainty requires a prohibitively large number of mock catalogues. In this work we test a novel technique for the estimation of the precision matrix, the covariance tapering method, in the context of baryon acoustic oscillation measurements. Even though this technique was originally devised as a way to speed up maximum-likelihood estimations, our results show that it also reduces the impact of noisy precision-matrix estimates on the derived confidence intervals, without introducing biases in the target parameters. The application of this technique can help future surveys to reach their true constraining power using a significantly smaller number of mock catalogues.
    Comment: 9 pages, 7 figures, minor changes to match version accepted by MNRAS
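
    The tapering idea itself is short to state: the noisy sample covariance estimated from the mocks is multiplied element-wise by a compactly supported, positive-definite taper before inversion, suppressing the poorly measured long-range covariances. The sketch below is an illustration under assumed ingredients (taper function, taper scale and Gaussian mock data), not the paper's configuration.

        # Minimal sketch of covariance tapering for the precision matrix; the taper,
        # scales and mock data are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(3)
        nbins, nmocks = 40, 100                     # data-vector size, number of mocks

        # "true" covariance with short-range correlations between neighbouring bins
        idx = np.arange(nbins)
        dist = np.abs(idx[:, None] - idx[None, :])
        C_true = np.exp(-dist / 3.0)

        # mock catalogues: draws from N(0, C_true); their sample covariance is noisy
        mocks = rng.multivariate_normal(np.zeros(nbins), C_true, size=nmocks)
        C_hat = np.cov(mocks, rowvar=False)

        # Wendland taper with compact support: exactly zero beyond the taper length
        Lt = 10.0
        u = np.clip(dist / Lt, 0.0, 1.0)
        taper = (1.0 - u) ** 4 * (4.0 * u + 1.0)

        Psi_plain   = np.linalg.inv(C_hat)          # standard precision estimate
        Psi_tapered = np.linalg.inv(C_hat * taper)  # tapered precision estimate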