
    Stochastic Models Involving Second Order Lévy Motions

    This thesis is based on five papers (A-E) treating estimation methods for unbounded densities, random fields generated by Lévy processes, the behaviour of Lévy processes at level crossings, and Markov random field mixtures of multivariate Gaussian fields. In Paper A we propose an estimator of the location parameter for a density that is unbounded at the mode. The estimator maximizes a modified likelihood in which the singular term in the full likelihood is left out whenever the parameter value approaches a neighborhood of the singularity location. The consistency and super-efficiency of this maximum leave-one-out likelihood estimator are shown through a direct argument. In Paper B we prove that the generalized Laplace distribution and the normal inverse Gaussian distribution are the only subclasses of the generalized hyperbolic distribution that are closed under convolution. In Paper C we propose non-Gaussian Matérn random field models, generated through stochastic partial differential equations, with the class of generalized hyperbolic processes as noise forcings. A maximum likelihood estimation technique based on the Monte Carlo Expectation Maximization algorithm is presented, and it is shown how to perform predictions at unobserved locations. In Paper D a novel class of models is introduced, denoted latent Gaussian random field mixture models, which combines the Markov random field mixture model with latent Gaussian random field models. The latent model, which is observed under measurement noise, is defined as a mixture of several, possibly multivariate, Gaussian random fields. Which of the fields is observed at each location is modeled using a discrete Markov random field. Efficient estimation methods for the parameters of the models are developed using a stochastic gradient algorithm. Paper E studies the behaviour of level crossings of non-Gaussian time series through a Slepian model. The approach develops a Slepian model for the underlying random noise that drives the process crossing the level. It is demonstrated how a moving average time series driven by Laplace noise can be analyzed through the Slepian noise approach. Methods for sampling the biased sampling distribution of the noise are based on a Gibbs sampler.
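The leave-one-out idea from Paper A can be illustrated with a minimal sketch. Everything below is a hypothetical setup, not the thesis's actual model: we assume a density f(x; θ) = |x − θ|^(−1/2) e^(−|x − θ|) / (2√π), which is unbounded at the mode θ, and maximize over a grid a likelihood from which the single observation nearest to θ (the one whose term would diverge) is dropped:

```python
import numpy as np

RNG = np.random.default_rng(0)

def loo_log_lik(theta, x):
    """Leave-one-out log-likelihood for the illustrative density
    f(x; t) = |x - t|^(-1/2) exp(-|x - t|) / (2 sqrt(pi)):
    the one observation nearest theta (the singular term) is omitted."""
    d = np.abs(x - theta)
    d = np.delete(d, np.argmin(d))          # leave out the singular term
    return np.sum(-0.5 * np.log(d) - d) - len(d) * np.log(2 * np.sqrt(np.pi))

# simulate from the density: |X - theta0| ~ Gamma(1/2, 1), sign uniform
theta0, n = 2.0, 2000
x = theta0 + RNG.choice([-1, 1], n) * RNG.gamma(0.5, 1.0, n)

# crude grid search for the maximum leave-one-out likelihood estimate
grid = np.linspace(1.5, 2.5, 2001)
est = grid[np.argmax([loo_log_lik(t, x) for t in grid])]
```

Because the singular term is removed rather than evaluated, the objective stays finite at every grid point; the density's spike at the mode makes observations cluster tightly around θ, which is what drives the estimator's fast (super-efficient) concentration.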

    Scalable iterative methods for sampling from massive Gaussian random vectors

    Sampling from Gaussian Markov random fields (GMRFs), that is, multivariate Gaussian random vectors that are parameterised by the inverse of their covariance matrix, is a fundamental problem in computational statistics. In this paper, we show how we can exploit arbitrarily accurate approximations to a GMRF to speed up Krylov subspace sampling methods. We also show that these methods can be used when computing the normalising constant of a large multivariate Gaussian distribution, which is needed for any likelihood-based inference method. The method we derive is also applicable to other structured Gaussian random vectors and, in particular, we show that when the precision matrix is a perturbation of a (block) circulant matrix, it is still possible to derive O(n log n) sampling schemes. (17 pages, 4 figures)
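The circulant case mentioned in the abstract can be made concrete with a standard FFT identity (this is a generic sketch, not the authors' code): a circulant precision matrix Q is diagonalized by the discrete Fourier transform, so Q^(−1/2) can be applied in O(n log n) and a draw from N(0, Q^(−1)) costs two FFTs. The ring-Laplacian precision below is an illustrative choice:

```python
import numpy as np

def sample_circulant_gmrf(first_row, rng):
    """Draw x ~ N(0, Q^{-1}) in O(n log n) when the precision Q is
    circulant: Q = F^H diag(lam) F with lam = fft(first_row)."""
    lam = np.fft.fft(first_row).real        # symmetric circulant => real spectrum
    assert np.all(lam > 0), "precision must be positive definite"
    z = rng.standard_normal(len(first_row))
    # the real symmetric operator below squares to Q^{-1}, so it is Q^{-1/2}
    return np.fft.ifft(np.fft.fft(z) / np.sqrt(lam)).real

# toy precision: discretised Laplacian plus a nugget on a ring of n sites
n = 8
c = np.zeros(n)
c[0], c[1], c[-1] = 2.5, -1.0, -1.0
rng = np.random.default_rng(1)
x = sample_circulant_gmrf(c, rng)
```

The same diagonalization gives the log-determinant (and hence the normalising constant) as the sum of log eigenvalues, which is why the circulant structure is so convenient for likelihood evaluation as well.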

    Computing the Cramer-Rao bound of Markov random field parameters: Application to the Ising and the Potts models

    This report considers the problem of computing the Cramer-Rao bound for the parameters of a Markov random field. Computation of the exact bound is not feasible for most fields of interest because their likelihoods are intractable and have intractable derivatives. We show here how it is possible to formulate the computation of the bound as a statistical inference problem that can be solved approximately, but with arbitrarily high accuracy, by using a Monte Carlo method. The proposed methodology is successfully applied to the Ising and the Potts models, where it is used to assess the performance of three state-of-the-art estimators of the parameters of these Markov random fields.
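The Monte Carlo route can be sketched for the single-parameter Ising case. For an exponential-family model p(s) ∝ exp(β S(s)), the Fisher information equals Var S(S), so the Cramer-Rao bound 1/I(β) can be estimated from samples of the sufficient statistic. The grid size, temperature, and basic single-site Gibbs sampler below are illustrative choices, not the report's actual experiment:

```python
import numpy as np

def gibbs_ising(beta, m, sweeps, rng):
    """Single-site Gibbs sampler for an m x m Ising model (free boundary);
    records the sufficient statistic S = sum of neighbour products per sweep."""
    s = rng.choice([-1, 1], size=(m, m))
    stats = []
    for _ in range(sweeps):
        for i in range(m):
            for j in range(m):
                nb = sum(s[a, b]
                         for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                         if 0 <= a < m and 0 <= b < m)
                p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * nb))
                s[i, j] = 1 if rng.random() < p_up else -1
        stats.append(np.sum(s[:-1, :] * s[1:, :]) + np.sum(s[:, :-1] * s[:, 1:]))
    return np.array(stats)

# p(s) prop. to exp(beta * S(s))  =>  I(beta) = Var S(s),  CRB = 1 / I(beta)
rng = np.random.default_rng(2)
stats = gibbs_ising(beta=0.3, m=8, sweeps=2000, rng=rng)[500:]  # burn-in dropped
fisher = np.var(stats)
crb = 1.0 / fisher
```

The accuracy of this plug-in estimate is limited only by the Monte Carlo effort, which matches the abstract's point that the bound can be approximated to arbitrarily high accuracy even though the likelihood itself is intractable.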

    Distributed Parameter Estimation in Probabilistic Graphical Models

    This paper presents foundational theoretical results on distributed parameter estimation for undirected probabilistic graphical models. It introduces a general condition on composite likelihood decompositions of these models which guarantees the global consistency of distributed estimators, provided the local estimators are consistent.
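The flavour of distributed composite likelihood estimation can be sketched on a toy Gaussian chain; the AR(1)-style model, the block split, and the averaging rule below are illustrative assumptions, not the paper's construction. Each worker maximizes a local conditional likelihood on its block of the chain, and because each local estimator is consistent, the combined estimator is too:

```python
import numpy as np

rng = np.random.default_rng(3)
theta0, n, k = 0.6, 60_000, 4           # true coupling, chain length, workers

# simulate a Gaussian chain: x_i | x_{i-1} ~ N(theta0 * x_{i-1}, 1)
x = np.empty(n)
x[0] = rng.standard_normal()
for i in range(1, n):
    x[i] = theta0 * x[i - 1] + rng.standard_normal()

# each worker maximises its local conditional (composite) likelihood,
# which here is a one-parameter least-squares problem on its block
blocks = np.array_split(np.arange(1, n), k)
local = [np.sum(x[b] * x[b - 1]) / np.sum(x[b - 1] ** 2) for b in blocks]

theta_hat = np.mean(local)              # simple consensus combination
```

No worker ever sees the full chain, yet the averaged estimate converges to the true coupling; the paper's condition characterises when such decompositions preserve global consistency in general undirected models.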
