
    MCMC methods for functions: modifying old algorithms to make them faster

    Many problems arising in applications result in the need to probe a probability distribution for functions. Examples include Bayesian nonparametric statistics and conditioned diffusion processes. Standard MCMC algorithms typically become arbitrarily slow under the mesh refinement dictated by nonparametric description of the unknown function. We describe an approach to modifying a whole range of MCMC methods which ensures that their speed of convergence is robust under mesh refinement. In the applications of interest the data is often sparse and the prior specification is an essential part of the overall modeling strategy. The algorithmic approach that we describe is applicable whenever the desired probability measure has density with respect to a Gaussian process or Gaussian random field prior, and to some useful non-Gaussian priors constructed through random truncation. Applications are shown in density estimation, data assimilation in fluid mechanics, subsurface geophysics and image registration. The key design principle is to formulate the MCMC method for functions. This leads to algorithms which can be implemented via minor modification of existing algorithms, yet which show enormous speed-up on a wide range of applied problems.
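
    One dimension-robust modification of random-walk Metropolis associated with this line of work is the preconditioned Crank–Nicolson (pCN) proposal. The sketch below illustrates the idea in a generic setting; the names log_lik, prior_sample and the step size beta are placeholders chosen for illustration, not the authors' implementation.

```python
import numpy as np

def pcn_mcmc(log_lik, prior_sample, u0, beta=0.2, n_iter=10_000, rng=None):
    """Preconditioned Crank-Nicolson MCMC (illustrative sketch).

    log_lik(u)     -- log-likelihood of the state u (negative data misfit)
    prior_sample() -- draws from the zero-mean Gaussian prior N(0, C)
    u0             -- initial state, e.g. a discretized function
    beta           -- step size in (0, 1]
    """
    rng = rng or np.random.default_rng()
    u, ll = u0, log_lik(u0)
    samples = []
    for _ in range(n_iter):
        xi = prior_sample()                          # fresh draw from the prior
        v = np.sqrt(1.0 - beta**2) * u + beta * xi   # pCN proposal
        ll_v = log_lik(v)
        # the acceptance ratio involves only the likelihood, not the prior,
        # which is what keeps the acceptance rate stable under mesh refinement
        if np.log(rng.uniform()) < ll_v - ll:
            u, ll = v, ll_v
        samples.append(np.array(u, copy=True))
    return np.array(samples)
```

    In a plain random-walk proposal the prior also enters the acceptance probability, which is what forces the step size to shrink as the discretization is refined; the pCN proposal avoids this, consistent with the robustness described in the abstract.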

    Analysis of the Gibbs sampler for hierarchical inverse problems

    Many inverse problems arising in applications come from continuum models where the unknown parameter is a field. In practice the unknown field is discretized, resulting in a problem in $\mathbb{R}^N$, with an understanding that refining the discretization, that is increasing $N$, will often be desirable. In the context of Bayesian inversion this situation suggests the importance of two issues: (i) defining hyper-parameters in such a way that they are interpretable in the continuum limit $N \to \infty$ and so that their values may be compared between different discretization levels; (ii) understanding the efficiency of algorithms for probing the posterior distribution as a function of large $N$. Here we address these two issues in the context of linear inverse problems subject to additive Gaussian noise within a hierarchical modelling framework based on a Gaussian prior for the unknown field and an inverse-gamma prior for a hyper-parameter, namely the amplitude of the prior variance. The structure of the model is such that the Gibbs sampler can be easily implemented for probing the posterior distribution. Subscribing to the dogma that one should think infinite-dimensionally before implementing in finite dimensions, we present function space intuition and provide rigorous theory showing that as $N$ increases, the component of the Gibbs sampler for sampling the amplitude of the prior variance becomes increasingly slow. We discuss a reparametrization of the prior variance that is robust with respect to the increase in dimension; we give numerical experiments which exhibit that our reparametrization prevents the slowing down. Our intuition on the behaviour of the prior hyper-parameter, with and without reparametrization, is sufficiently general to include a broad class of nonlinear inverse problems as well as other families of hyper-priors. Comment: to appear, SIAM/ASA Journal on Uncertainty Quantification.
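
    As an illustration of the centred hierarchical setup described above, the following is a minimal sketch of such a Gibbs sampler for a finite-dimensional linear model y = A u + noise with noise variance sigma2, prior u | delta ~ N(0, delta * C0) and delta ~ InvGamma(alpha, beta). All names and the starting value are illustrative assumptions, not code from the paper.

```python
import numpy as np

def gibbs_hierarchical(y, A, C0, sigma2, alpha, beta, n_iter=5000, rng=None):
    """Two-block Gibbs sampler for a linear-Gaussian hierarchical model (sketch).

    y = A u + noise,  noise ~ N(0, sigma2 * I)
    u | delta ~ N(0, delta * C0),  delta ~ InvGamma(alpha, beta)
    """
    rng = rng or np.random.default_rng()
    N = C0.shape[0]
    C0_inv = np.linalg.inv(C0)
    delta = beta / (alpha + 1.0)              # start at the prior mode
    us, deltas = [], []
    for _ in range(n_iter):
        # 1) u | delta, y : Gaussian, by standard linear-Gaussian conjugacy
        prec = A.T @ A / sigma2 + C0_inv / delta
        cov = np.linalg.inv(prec)
        mean = cov @ (A.T @ y) / sigma2
        u = rng.multivariate_normal(mean, cov)
        # 2) delta | u : inverse-gamma with updated shape and scale
        shape = alpha + 0.5 * N
        scale = beta + 0.5 * (u @ C0_inv @ u)
        delta = scale / rng.gamma(shape)      # InvGamma(shape, scale) draw
        us.append(u)
        deltas.append(delta)
    return np.array(us), np.array(deltas)
```

    A non-centred reparametrization, for example writing u = sqrt(delta) * v with v ~ N(0, C0) and sampling (v, delta) instead of (u, delta), is one standard way to obtain the dimension-robust behaviour described in the abstract; the paper's precise construction may differ.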

    Statistical Inference for Complex Time Series Data

    During recent years the focus of scientific interest has turned from low dimensional stationary time series to nonstationary time series and high dimensional time series. In addition, new methodological challenges are coming from high frequency finance, where data are recorded and analyzed on a millisecond basis. The three topics “nonstationarity”, “high dimensionality” and “high frequency” are at the forefront of present research in time series analysis. The topics also have some overlap, in that there already exists work on the intersection of these three topics, e.g. on locally stationary diffusion models, on high dimensional covariance matrices for high frequency data, or on multivariate dynamic factor models for nonstationary processes. The aim of the workshop was to bring together researchers from time series analysis, nonparametric statistics, econometrics and empirical finance to work on these topics. This aim was successfully achieved and the workshop was very well attended.

    Lack-of-fit tests in semiparametric mixed models.

    In this paper we obtain the asymptotic distribution of restricted likelihood ratio tests in mixed linear models with a fixed and finite number of random effects. We explain why, for such models, the often quoted 50:50 mixture of a chi-squared random variable with one degree of freedom and a point mass at zero does not hold. Our motivation is a study of the use of wavelets for lack-of-fit testing within a mixed model framework. Even though wavelets have received a lot of attention over the last, say, 15 years for the estimation of piecewise smooth functions, much less is known about their ability to check the adequacy of a parametric model when fitting the observed data. In particular we study the power of wavelets for testing a hypothesized parametric model within a mixed model framework. Experimental results show that in several situations the wavelet-based test significantly outperforms the competitor based on penalized regression splines. The obtained results are also applicable for testing in mixed models in general, and shed some new insight into previous results. Keywords: lack-of-fit test; likelihood ratio test; mixed models; one-sided test; penalization; restricted maximum likelihood; variance components; wavelets; asymptotic distribution.
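
    For background, the often-quoted asymptotic null law mentioned above, arising when a single variance component is tested on the boundary of the parameter space, is the equal mixture below; the point of the abstract is precisely that this mixture need not hold for the models considered.

```latex
% Often-quoted 50:50 mixture for the (restricted) likelihood ratio statistic
% when testing H_0: sigma_b^2 = 0 against sigma_b^2 > 0 (boundary case)
\lim_{n \to \infty} \Pr\left( \mathrm{RLRT}_n \le c \right)
  = \tfrac{1}{2} + \tfrac{1}{2}\,\Pr\left( \chi^2_1 \le c \right), \qquad c \ge 0.
```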

    Fourier analysis of stationary time series in function space

    We develop the basic building blocks of a frequency domain framework for drawing statistical inferences on the second-order structure of a stationary sequence of functional data. The key element in such a context is the spectral density operator, which generalises the notion of a spectral density matrix to the functional setting, and characterises the second-order dynamics of the process. Our main tool is the functional Discrete Fourier Transform (fDFT). We derive an asymptotic Gaussian representation of the fDFT, thus allowing the transformation of the original collection of dependent random functions into a collection of approximately independent complex-valued Gaussian random functions. Our results are then employed in order to construct estimators of the spectral density operator based on smoothed versions of the periodogram kernel, the functional generalisation of the periodogram matrix. The consistency and asymptotic law of these estimators are studied in detail. As immediate consequences, we obtain central limit theorems for the mean and the long-run covariance operator of a stationary functional time series. Our results do not depend on structural modelling assumptions, but only on functional versions of classical cumulant mixing conditions, and are shown to be stable under discrete observation of the individual curves. Comment: Published at http://dx.doi.org/10.1214/13-AOS1086 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
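
    A minimal numerical sketch of the fDFT and a smoothed-periodogram estimator is given below, assuming curves observed on a common grid of p points; the flat smoothing weights and the normalisation convention are illustrative choices, not the estimator exactly as studied in the paper.

```python
import numpy as np

def fdft(X):
    """Functional discrete Fourier transform (sketch).

    X : array of shape (T, p) -- T curves, each observed on a grid of p points.
    Returns an array of shape (T, p); row j is the fDFT at frequency 2*pi*j/T.
    """
    T = X.shape[0]
    # FFT over the time index with a (2*pi*T)^{-1/2} normalisation
    return np.fft.fft(X, axis=0) / np.sqrt(2.0 * np.pi * T)

def smoothed_periodogram(X, bandwidth=5):
    """Smoothed-periodogram estimate of the spectral density operator (sketch).

    Returns an array of shape (T, p, p); slice j is the estimated kernel of
    the spectral density operator at frequency 2*pi*j/T, obtained by averaging
    periodogram operators over 2*bandwidth + 1 neighbouring frequencies.
    """
    T, p = X.shape
    F = fdft(X - X.mean(axis=0))             # centre the curves first
    # periodogram operator at each frequency: outer product of the fDFT
    P = np.einsum('jt,js->jts', F, np.conj(F))
    est = np.empty_like(P)
    for j in range(T):
        idx = np.arange(j - bandwidth, j + bandwidth + 1) % T
        est[j] = P[idx].mean(axis=0)         # flat smoothing weights
    return est
```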
