
    Scalable iterative methods for sampling from massive Gaussian random vectors

    Sampling from Gaussian Markov random fields (GMRFs), that is, multivariate Gaussian random vectors that are parameterised by the inverse of their covariance matrix, is a fundamental problem in computational statistics. In this paper, we show how we can exploit arbitrarily accurate approximations to a GMRF to speed up Krylov subspace sampling methods. We also show that these methods can be used when computing the normalising constant of a large multivariate Gaussian distribution, which is needed for any likelihood-based inference method. The method we derive is also applicable to other structured Gaussian random vectors and, in particular, we show that when the precision matrix is a perturbation of a (block) circulant matrix, it is still possible to derive O(n log n) sampling schemes.
    Comment: 17 pages, 4 figures
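    The O(n log n) claim for circulant precision matrices follows because a circulant matrix is diagonalised by the FFT. Below is a minimal sketch of the exact FFT-based sampler for the purely circulant case only (the paper's contribution, handling perturbations of this structure and Krylov methods, is not covered); the precision base and all names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric circulant precision matrix Q, defined by its first column.
# Illustrative base: a discrete Laplacian plus a ridge, which is SPD.
n = 1024
q_base = np.zeros(n)
q_base[0], q_base[1], q_base[-1] = 2.5, -1.0, -1.0

# Q = (1/n) F^H diag(lam) F, so the eigenvalues of Q are the FFT of its base.
lam = np.fft.fft(q_base).real        # real and positive for this SPD choice

# Sample x ~ N(0, Q^{-1}) in O(n log n): for complex standard normal eps,
# sqrt(n) * ifft(eps / sqrt(lam)) has independent real and imaginary parts,
# each distributed as N(0, Q^{-1}).
eps = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = np.sqrt(n) * np.fft.ifft(eps / np.sqrt(lam))
x1, x2 = y.real, y.imag              # two independent GMRF samples

# Sanity check: the marginal variance of each coordinate equals mean(1/lam).
draws = np.sqrt(n) * np.fft.ifft(
    (rng.standard_normal((500, n)) + 1j * rng.standard_normal((500, n)))
    / np.sqrt(lam), axis=1).real
print(draws[:, 0].var(), (1.0 / lam).mean())
```

    Each draw costs one FFT, which is where the O(n log n) complexity quoted in the abstract comes from; the block circulant case works the same way with a 2-D FFT.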

    State-Space Inference and Learning with Gaussian Processes

    State-space inference and learning with Gaussian processes (GPs) is an unsolved problem. We propose a new, general methodology for inference and learning in nonlinear state-space models that are described probabilistically by non-parametric GP models. We apply the expectation maximization algorithm to iterate between inference in the latent state-space and learning the parameters of the underlying GP dynamics model.
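    As a concrete, heavily simplified illustration of the alternation the abstract describes, here is a hard-EM style sketch for a 1-D model: the E-step approximates the latent states with a bootstrap particle filter, and the M-step refits an RBF GP to consecutive state estimates by a grid search over the lengthscale. The toy model, noise levels, and every name here are invented for the example; the paper's full algorithm uses proper posterior inference over the states and principled hyperparameter learning.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- toy data: nonlinear 1-D state-space model ---------------------------
T, q, r = 200, 0.1, 0.1                    # length, process/obs noise stds
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * np.sin(2.0 * x[t - 1]) + q * rng.standard_normal()
y = x + r * rng.standard_normal(T)         # noisy observations

# --- GP regression on transition pairs (RBF kernel) ----------------------
def gp_fit(X, Y, ls, sn):
    K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / ls ** 2)
    L = np.linalg.cholesky(K + sn ** 2 * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    return X, alpha, ls

def gp_mean(model, xs):                    # GP posterior mean at xs
    X, alpha, ls = model
    k = np.exp(-0.5 * (xs[:, None] - X[None, :]) ** 2 / ls ** 2)
    return k @ alpha

def gp_nll(X, Y, ls, sn):                  # negative log marginal likelihood
    K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / ls ** 2)
    L = np.linalg.cholesky(K + sn ** 2 * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    return 0.5 * Y @ alpha + np.log(np.diag(L)).sum()

# --- E-step: bootstrap particle filter, filtered means as estimates ------
def e_step(model, y, n_part=300):
    parts = rng.standard_normal(n_part)
    means = np.zeros(len(y))
    for t in range(len(y)):
        if t > 0:                          # propagate through GP dynamics
            parts = gp_mean(model, parts) + q * rng.standard_normal(n_part)
        w = np.exp(-0.5 * (y[t] - parts) ** 2 / r ** 2) + 1e-300
        w /= w.sum()
        means[t] = w @ parts
        parts = rng.choice(parts, n_part, p=w)   # multinomial resampling
    return means

# --- EM loop: alternate state inference and dynamics learning ------------
mu = y.copy()                              # initialise states from the data
for it in range(5):
    Xtr, Ytr = mu[:-1], mu[1:]             # consecutive-state training pairs
    ls = min([0.3, 0.6, 1.0, 2.0],         # M-step: pick lengthscale by
             key=lambda l: gp_nll(Xtr, Ytr, l, q))  # marginal likelihood
    model = gp_fit(Xtr, Ytr, ls, q)
    mu = e_step(model, y)                  # E-step: re-infer latent states
print("learned lengthscale:", ls)
```

    The point of the sketch is the loop structure: state inference and dynamics learning each assume the other's current estimate, exactly the circular dependency that motivates EM here.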

    Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)

    We introduce a new structured kernel interpolation (SKI) framework, which generalises and unifies inducing point methods for scalable Gaussian processes (GPs). SKI methods produce kernel approximations for fast computations through kernel interpolation. The SKI framework clarifies how the quality of an inducing point approach depends on the number of inducing (aka interpolation) points, the interpolation strategy, and the GP covariance kernel. SKI also provides a mechanism to create new scalable kernel methods, through choosing different kernel interpolation strategies. Using SKI, with local cubic kernel interpolation, we introduce KISS-GP, which 1) is more scalable than inducing point alternatives, 2) naturally enables Kronecker and Toeplitz algebra for substantial additional gains in scalability, without requiring any grid data, and 3) can be used for fast and expressive kernel learning. KISS-GP costs O(n) time and storage for GP inference. We evaluate KISS-GP for kernel matrix approximation, kernel learning, and natural sound modelling.
    Comment: 19 pages, 4 figures
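    To make the SKI idea concrete, here is a minimal 1-D sketch of the kernel approximation K ≈ W K_UU W^T, using local linear interpolation (two nonzero weights per row of W) rather than the local cubic interpolation the paper proposes (four nonzeros); the grid size, kernel, and variable names are illustrative.

```python
import numpy as np

def rbf(a, b, ls=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

m, n = 50, 200                             # inducing grid size, data size
u = np.linspace(0.0, 1.0, m)               # regular grid -> K_UU is Toeplitz
x = np.sort(np.random.default_rng(1).uniform(0.0, 1.0, n))

# Sparse interpolation weights W: local linear interpolation puts two
# nonzeros per row (in practice W would be stored as a sparse matrix).
h = u[1] - u[0]
idx = np.clip(((x - u[0]) / h).astype(int), 0, m - 2)
frac = (x - u[idx]) / h
W = np.zeros((n, m))
W[np.arange(n), idx] = 1.0 - frac
W[np.arange(n), idx + 1] = frac

K_ski = W @ rbf(u, u) @ W.T                # SKI: K_xx ~ W K_UU W^T
err = np.abs(rbf(x, x) - K_ski).max()
print(f"max abs error of SKI approximation: {err:.2e}")
```

    Because the inducing points sit on a regular grid, K_UU is Toeplitz (Kronecker in higher dimensions) and W is sparse, so matrix-vector products with the approximate kernel are fast; this structure is what underlies the O(n) inference cost quoted in the abstract.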