39,992 research outputs found
Oscillating Gaussian Processes
In this article we introduce and study oscillating Gaussian processes defined
by , where
are free parameters and is either a stationary or a
self-similar Gaussian process. We study the basic properties of and
consider estimation of the model parameters. In particular, we show that the
moment estimators converge in and are, when suitably normalised,
asymptotically normal.
Deep Gaussian Processes
In this paper we introduce deep Gaussian process (GP) models. Deep GPs are a
deep belief network based on Gaussian process mappings. The data is modeled as
the output of a multivariate GP. The inputs to that Gaussian process are then
governed by another GP. A single layer model is equivalent to a standard GP or
the GP latent variable model (GP-LVM). We perform inference in the model by
approximate variational marginalization. This results in a strict lower bound
on the marginal likelihood of the model which we use for model selection
(number of layers and nodes per layer). Deep belief networks are typically
applied to relatively large data sets using stochastic gradient descent for
optimization. Our fully Bayesian treatment allows for the application of deep
models even when data is scarce. Model selection by our variational bound shows
that a five layer hierarchy is justified even when modelling a digit data set
containing only 150 examples. Comment: 9 pages, 8 figures. Appearing in Proceedings of the 16th
International Conference on Artificial Intelligence and Statistics (AISTATS)
201
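The layered construction described in the abstract — a GP whose inputs are themselves governed by another GP — can be illustrated by forward sampling from a two-layer model. This is only a generative sketch under illustrative names and kernel choices (the paper's actual contribution, variational inference over such hierarchies, is not shown):

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between rows of x and rows of y."""
    sq = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sample_gp_layer(inputs, rng, jitter=1e-8):
    """Draw one zero-mean GP function sample evaluated at `inputs`."""
    K = rbf_kernel(inputs, inputs) + jitter * np.eye(len(inputs))
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal((len(inputs), 1))

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 50)[:, None]   # observed inputs
H = sample_gp_layer(X, rng)           # hidden layer: GP draw over X
Y = sample_gp_layer(H, rng)           # output layer: GP draw over H
```

Stacking a second layer in this way produces non-Gaussian marginals over `Y`, which is what gives deep GPs their extra flexibility over a single-layer GP.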
Distributed Gaussian Processes
To scale Gaussian processes (GPs) to large data sets we introduce the robust
Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts
model for large-scale distributed GP regression. Unlike state-of-the-art sparse
GP approximations, the rBCM is conceptually simple and does not rely on
inducing or variational parameters. The key idea is to recursively distribute
computations to independent computational units and, subsequently, recombine
them to form an overall result. Efficient closed-form inference allows for
straightforward parallelisation and distributed computations with a small
memory footprint. The rBCM is independent of the computational graph and can be
used on heterogeneous computing infrastructures, ranging from laptops to
clusters. With sufficient computing resources our distributed GP model can
handle arbitrarily large data sets. Comment: 10 pages, 5 figures. Appears in Proceedings of ICML 201
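The recombination step described above can be sketched as follows, assuming each independent expert has already produced a Gaussian prediction N(mu_k, var_k) at a test point. The entropy-based weights beta_k follow the published rBCM form, but the function name and interface here are hypothetical:

```python
import numpy as np

def rbcm_combine(means, variances, prior_var):
    """Combine per-expert Gaussian predictions with rBCM weights.

    beta_k = 0.5 * (log prior_var - log var_k) is the differential-entropy
    weight; experts whose prediction is no more confident than the prior
    get weight ~0 and are effectively ignored.
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    beta = 0.5 * (np.log(prior_var) - np.log(variances))
    # Precision-weighted combination, corrected back toward the prior
    # so the weights need not sum to one.
    precision = np.sum(beta / variances) + (1.0 - beta.sum()) / prior_var
    var = 1.0 / precision
    mean = var * np.sum(beta * means / variances)
    return mean, var
```

Because each expert only needs to report a mean and a variance, the combination is a cheap reduction that can run after the experts have been evaluated in parallel on separate machines.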
Extremes of Independent Gaussian Processes
For every , let be independent copies of a
zero-mean Gaussian process . We describe all processes
which can be obtained as limits, as , of the process
, where and are
normalizing constants. We also provide an analogous characterization for the
limits of the process , where . Comment: 19 pages