
    Deep Gaussian Mixture Models

    Deep learning is a hierarchical inference method that stacks multiple layers of learning in order to describe complex relationships more efficiently. In this work, Deep Gaussian Mixture Models are introduced and discussed. A Deep Gaussian Mixture Model (DGMM) is a network of multiple layers of latent variables where, at each layer, the variables follow a mixture of Gaussian distributions. The deep mixture model therefore consists of a set of nested mixtures of linear models, which globally provide a nonlinear model able to describe the data in a very flexible way. In order to avoid overparameterized solutions, dimension reduction by factor models can be applied at each layer of the architecture, resulting in deep mixtures of factor analysers.
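
    As a rough illustration, the sketch below generates data from a two-layer DGMM of this nested linear-Gaussian form; the component parameters (pi, eta, Lam, Psi) and layer sizes are invented for the example, and the paper's actual estimation procedure is not reproduced.

    # A minimal generative sketch of a two-layer Deep Gaussian Mixture Model
    # (hypothetical parameter values; fitting by EM is not shown).
    import numpy as np

    rng = np.random.default_rng(0)

    # Each layer is a mixture of linear-Gaussian maps:
    # z_{h-1} = eta_j + Lam_j @ z_h + noise, chosen with probability pi_j.
    layer2 = [  # deepest layer: maps a 1-d latent z2 to a 2-d latent z1
        dict(pi=0.5, eta=np.array([0.0, 0.0]), Lam=np.array([[1.0], [0.5]]), Psi=0.1 * np.eye(2)),
        dict(pi=0.5, eta=np.array([3.0, -1.0]), Lam=np.array([[-0.5], [1.0]]), Psi=0.1 * np.eye(2)),
    ]
    layer1 = [  # top layer: maps the 2-d latent z1 to a 2-d observation y
        dict(pi=0.7, eta=np.zeros(2), Lam=np.eye(2), Psi=0.05 * np.eye(2)),
        dict(pi=0.3, eta=np.array([-2.0, 2.0]), Lam=np.array([[0.8, 0.2], [0.0, 1.0]]), Psi=0.05 * np.eye(2)),
    ]

    def sample_layer(components, z):
        """Pick one mixture component and apply its linear-Gaussian map to z."""
        probs = [c["pi"] for c in components]
        c = components[rng.choice(len(components), p=probs)]
        return c["eta"] + c["Lam"] @ z + rng.multivariate_normal(np.zeros(len(c["eta"])), c["Psi"])

    # Generate an observation by propagating a standard normal latent through the layers.
    z2 = rng.standard_normal(1)    # deepest latent variable, z^(2) ~ N(0, I)
    z1 = sample_layer(layer2, z2)  # nested mixture of linear models
    y = sample_layer(layer1, z1)   # observed data point
    print(y)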

    On generalizing Gaussian graphical models

    Preprint

    Positivity for Gaussian graphical models

    Gaussian graphical models are parametric statistical models for jointly normal random variables whose dependence structure is determined by a graph. In previous work, we introduced trek separation, which gives a necessary and sufficient condition in terms of the graph for when a subdeterminant is zero for all covariance matrices that belong to the Gaussian graphical model. Here we extend this result to give explicit cancellation-free formulas for the expansions of nonzero subdeterminants.
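
    As a concrete toy check, the sketch below builds the covariance matrix of a small directed Gaussian graphical model and verifies numerically that a subdeterminant forced to vanish by trek separation is indeed zero; the specific DAG and random edge weights are invented for illustration, and the cancellation-free expansion formulas themselves are not implemented.

    # Toy DAG with edges 1->3, 2->3, 3->4, 3->5: every trek from {1,2} to {4,5}
    # passes through node 3, so det(Sigma[{1,2},{4,5}]) vanishes for all edge weights.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5
    Lam = np.zeros((n, n))                           # Lam[i, j] = weight of edge i -> j
    for i, j in [(0, 2), (1, 2), (2, 3), (2, 4)]:    # 0-indexed edges of the DAG above
        Lam[i, j] = rng.normal()
    Omega = np.diag(rng.uniform(0.5, 2.0, size=n))   # independent error variances

    # Covariance of the linear structural equation model X = Lam^T X + eps.
    B = np.linalg.inv(np.eye(n) - Lam)
    Sigma = B.T @ Omega @ B

    minor = np.linalg.det(Sigma[np.ix_([0, 1], [3, 4])])
    print(f"det Sigma[{{1,2}},{{4,5}}] = {minor:.2e}")   # numerically zero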

    Gaussian process model based predictive control

    Gaussian process models provide a probabilistic non-parametric modelling approach for black-box identification of non-linear dynamic systems. Gaussian processes can highlight areas of the input space where prediction quality is poor, due to a lack of data or to its complexity, by indicating a higher variance around the predicted mean. Gaussian process models also contain noticeably fewer coefficients to be optimized. This paper illustrates a possible application of Gaussian process models within model-based predictive control. The extra information provided by the Gaussian process model is used in predictive control, where optimization of the control signal takes the variance information into account. The predictive control principle is demonstrated on the control of a pH process benchmark.
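
    The sketch below illustrates the general idea under simplifying assumptions: a scikit-learn Gaussian process is fitted as a one-step model y_{k+1} = f(y_k, u_k) of a made-up plant, and the predictive variance enters the cost that the control sequence is optimized against. The plant, horizon, weights, and the naive mean-only propagation are all assumptions for the example, not the paper's exact formulation.

    # Variance-aware predictive control with a GP one-step model (hedged sketch).
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(2)

    # Identify a toy nonlinear system from input/output data (black-box identification).
    def plant(y, u):
        return 0.8 * y + 0.4 * np.tanh(u) + 0.02 * rng.normal()

    Y, U = [0.0], rng.uniform(-2, 2, 60)
    for u in U:
        Y.append(plant(Y[-1], u))
    X_train = np.column_stack([Y[:-1], U])   # inputs: (y_k, u_k)
    y_train = np.array(Y[1:])                # targets: y_{k+1}
    gp = GaussianProcessRegressor(RBF() + WhiteKernel(), normalize_y=True).fit(X_train, y_train)

    def mpc_cost(u_seq, y0, ref, var_weight=5.0):
        """Tracking cost plus a penalty on predictive variance over the horizon."""
        y, cost = y0, 0.0
        for u in u_seq:
            mean, std = gp.predict(np.array([[y, u]]), return_std=True)
            cost += (mean[0] - ref) ** 2 + var_weight * std[0] ** 2 + 0.01 * u ** 2
            y = mean[0]                      # naive mean propagation over the horizon
        return cost

    res = minimize(mpc_cost, x0=np.zeros(5), args=(0.0, 1.0), method="Nelder-Mead")
    print("optimized control sequence:", np.round(res.x, 3))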

    Gaussian process models for periodicity detection

    We consider the problem of detecting and quantifying the periodic component of a function given noise-corrupted observations of a limited number of input/output tuples. Our approach is based on Gaussian process regression, which provides a flexible non-parametric framework for modelling periodic data. We introduce a novel decomposition of the covariance function as the sum of periodic and aperiodic kernels. This decomposition allows for the creation of sub-models which capture the periodic nature of the signal and its complement. To quantify the periodicity of the signal, we derive a periodicity ratio which reflects the uncertainty in the fitted sub-models. Although the method can be applied to many kernels, we give special emphasis to the Matérn family, from the expression of the reproducing kernel Hilbert space inner product to the implementation of the associated periodic kernels in a Gaussian process toolkit. The proposed method is illustrated by considering the detection of periodically expressed genes in the Arabidopsis genome.
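
    A minimal sketch of the periodic-plus-aperiodic decomposition is given below, using scikit-learn's ExpSineSquared and Matern kernels as stand-ins for the paper's Matérn-based periodic kernels; the fixed hyperparameters and the simple energy-based ratio are placeholders for the paper's RKHS-based periodicity ratio, not a reimplementation of it.

    # Additive GP decomposition f = f_per + f_aper with a crude periodicity score.
    import numpy as np
    from sklearn.gaussian_process.kernels import ExpSineSquared, Matern

    rng = np.random.default_rng(3)
    X = np.linspace(0, 10, 120)[:, None]
    y = np.sin(2 * np.pi * X[:, 0]) + 0.3 * X[:, 0] + 0.2 * rng.normal(size=120)

    k_per = ExpSineSquared(length_scale=1.0, periodicity=1.0)   # periodic kernel
    k_aper = Matern(length_scale=2.0, nu=2.5)                   # aperiodic complement
    noise = 0.05                                                # fixed, not optimized

    # Posterior means of the two additive components under the summed kernel.
    K = k_per(X) + k_aper(X) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, y)
    f_per = k_per(X) @ alpha
    f_aper = k_aper(X) @ alpha

    # Crude periodicity ratio: share of explained signal energy in the periodic part.
    ratio = np.sum(f_per**2) / (np.sum(f_per**2) + np.sum(f_aper**2))
    print(f"periodicity ratio ~ {ratio:.2f}")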