
    A Mutually-Dependent Hadamard Kernel for Modelling Latent Variable Couplings

    We introduce a novel kernel that models input-dependent couplings across multiple latent processes. The pairwise joint kernel measures covariance along inputs and across different latent signals in a mutually-dependent fashion. A latent correlation Gaussian process (LCGP) model combines these non-stationary latent components into multiple outputs by an input-dependent mixing matrix. Probit classification and support for multiple observation sets are derived by variational Bayesian inference. Results on several datasets indicate that the LCGP model can recover the correlations between latent signals while simultaneously achieving state-of-the-art performance. We highlight the latent covariances with an EEG classification dataset where latent brain processes and their couplings simultaneously emerge from the model.

    Comment: 17 pages, 6 figures; accepted to ACML 201
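    The full LCGP construction is more involved, but the core idea of a pairwise joint ("Hadamard") kernel, an input covariance multiplied elementwise by an input-dependent coupling between latent signals, can be sketched as follows. All function names and the particular coupling function are illustrative assumptions, not the authors' model, and a real implementation must additionally ensure the assembled matrix is a valid (positive semi-definite) covariance:

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0):
    """Squared-exponential covariance along the inputs."""
    d = np.subtract.outer(x1, x2)
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def coupling(x1, x2, q1, q2, freq=0.5):
    """Hypothetical input-dependent coupling between latent signals q1, q2.

    Returns 1 for q1 == q2 and a smoothly varying correlation in (-1, 1)
    otherwise; purely illustrative, not the paper's parameterisation.
    """
    if q1 == q2:
        return np.ones((len(x1), len(x2)))
    mid = 0.5 * (x1[:, None] + x2[None, :])  # coupling varies with location
    return 0.9 * np.sin(freq * mid)

def hadamard_kernel(x1, x2, q1, q2, lengthscale=1.0):
    """Pairwise joint kernel: input covariance Hadamard-multiplied
    by the input-dependent latent coupling."""
    return rbf(x1, x2, lengthscale) * coupling(x1, x2, q1, q2)

x = np.linspace(0.0, 5.0, 20)
Q = 2  # number of latent signals
# Block covariance over all (latent signal, input) pairs
K = np.block([[hadamard_kernel(x, x, i, j) for j in range(Q)]
              for i in range(Q)])
```

    With a fixed (input-independent) coupling matrix this reduces to the familiar coregionalisation setup; the input dependence of `coupling` is what lets the covariance between latent signals change over the input space.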

    Aggregation of predictors for nonstationary sub-linear processes and online adaptive forecasting of time varying autoregressive processes

    In this work, we study the problem of aggregating a finite number of predictors for nonstationary sub-linear processes. We provide oracle inequalities relying essentially on three ingredients: (1) a uniform bound on the ℓ¹ norm of the time-varying sub-linear coefficients, (2) a Lipschitz assumption on the predictors and (3) moment conditions on the noise appearing in the linear representation. Two kinds of aggregations are considered, giving rise to different moment conditions on the noise and more or less sharp oracle inequalities. We apply this approach to derive an adaptive predictor for locally stationary time-varying autoregressive (TVAR) processes. It is obtained by aggregating a finite number of well-chosen predictors, each of them enjoying an optimal minimax convergence rate under specific smoothness conditions on the TVAR coefficients. We show that the obtained aggregated predictor achieves a minimax rate while adapting to the unknown smoothness. To prove this result, a lower bound is established for the minimax rate of the prediction risk for the TVAR process. Numerical experiments complete this study. An important feature of this approach is that the aggregated predictor can be computed recursively and is thus applicable in an online prediction context.

    Comment: Published at http://dx.doi.org/10.1214/15-AOS1345 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org
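    As a rough illustration of the online aggregation idea (not the paper's exact procedure, weights, or rates), here is a minimal exponentially weighted aggregation of two fixed AR(1) "expert" predictors on a toy TVAR(1) series whose coefficient drifts slowly. The function name, the learning rate, and the toy data-generating process are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def ewa_forecast(y, experts, eta=1.0):
    """Online exponentially weighted aggregation of expert predictors.

    y:       observed series, shape (T,)
    experts: one-step-ahead expert predictions, shape (K, T)
    eta:     learning rate of the exponential weight update
    Returns the aggregated predictions, shape (T,).
    """
    K, T = experts.shape
    w = np.full(K, 1.0 / K)            # uniform prior weights
    agg = np.empty(T)
    for t in range(T):
        agg[t] = w @ experts[:, t]     # predict before observing y[t]
        loss = (experts[:, t] - y[t]) ** 2
        w = w * np.exp(-eta * loss)    # reweight by recent performance
        w /= w.sum()
    return agg

# Toy TVAR(1) series: the AR coefficient sweeps from +0.9 to -0.9
T = 500
a = 0.9 * np.cos(np.linspace(0.0, np.pi, T))
y = np.zeros(T)
for t in range(1, T):
    y[t] = a[t] * y[t - 1] + 0.1 * rng.standard_normal()

# Two fixed-coefficient AR(1) experts; the aggregate adapts between them
lagged = np.r_[0.0, y[:-1]]
experts = np.vstack([0.9 * lagged, -0.9 * lagged])
pred = ewa_forecast(y, experts, eta=2.0)
```

    The recursive weight update is what makes the aggregated predictor computable online, as the abstract notes; each new observation only requires one pass over the experts' latest losses.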

    Non-separable non-stationary random fields

    We describe a framework for constructing nonstationary nonseparable random fields based on an infinite mixture of convolved stochastic processes. When the mixing process is stationary but the convolution function is nonstationary, we arrive at nonseparable kernels with constant non-separability that are available in closed form. When the mixing is nonstationary and the convolution function is stationary, we arrive at nonseparable random fields that have varying non-separability and better preserve local structure. These fields have natural interpretations through the spectral representation of stochastic differential equations (SDEs) and are demonstrated on a range of synthetic benchmarks and spatio-temporal applications in geostatistics and machine learning. We show how a single Gaussian process (GP) with these random fields can computationally and statistically outperform both separable and existing nonstationary nonseparable approaches such as treed GPs and deep GP constructions.
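    A minimal sketch of the underlying process-convolution idea in one dimension, assuming a simple discretised setting: white noise is smoothed by a Gaussian kernel whose lengthscale varies with the output location, which yields a nonstationary field. The function names and the lengthscale profile are illustrative, not the paper's mixture construction:

```python
import numpy as np

rng = np.random.default_rng(1)

def smoothing_kernel(s, u, lengthscale):
    """Gaussian smoothing kernel centred at u, evaluated at s."""
    return np.exp(-0.5 * ((s - u) / lengthscale) ** 2)

def convolved_field(s_grid, u_grid, lengthscale_fn):
    """Draw one realisation of a process-convolution field:
    white noise on u_grid is integrated against a kernel whose
    width depends on the output location s (nonstationarity)."""
    du = u_grid[1] - u_grid[0]
    noise = rng.standard_normal(len(u_grid)) / np.sqrt(du)
    ls = lengthscale_fn(s_grid)              # input-dependent lengthscale
    W = smoothing_kernel(s_grid[:, None],
                         u_grid[None, :],
                         ls[:, None])        # location-varying kernel width
    return W @ noise * du                    # discretised convolution

s = np.linspace(0.0, 10.0, 200)
u = np.linspace(-2.0, 12.0, 700)
# Short lengthscale (rough field) on the left, long (smooth) on the right
field = convolved_field(s, u, lambda x: 0.1 + 0.15 * x)
```

    Holding the noise stationary and varying the kernel, as here, corresponds to the second regime described in the abstract; the abstract's other regime instead varies the mixing process while keeping the convolution function fixed.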