Linear latent force models using Gaussian processes.
Purely data-driven approaches to machine learning run into difficulties when data are scarce relative to the complexity of the model, or when the model is forced to extrapolate. Purely mechanistic approaches, on the other hand, need to identify and specify all the interactions in the problem at hand (which may not be feasible) and still leave open the question of how to parameterize the system. In this paper, we present a hybrid approach using Gaussian processes and differential equations to combine data-driven modeling with a physical model of the system. We show how different, physically inspired kernel functions can be developed through sensible, simple, mechanistic assumptions about the underlying system. The versatility of our approach is illustrated with three case studies from motion capture, computational biology, and geostatistics.
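The hybrid idea in the abstract above can be illustrated numerically. As a hedged sketch (not the paper's closed-form kernel derivation), assume the simplest mechanistic model, first-order dynamics x'(t) = -D·x(t) + u(t), driven by a latent force u(t) with a Gaussian process prior, and estimate the covariance it induces on the output by Monte Carlo. The decay constant, RBF hyperparameters, and Euler integration are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Time grid for the system.
t = np.linspace(0.0, 5.0, 100)
dt = t[1] - t[0]


def rbf(t1, t2, variance=1.0, lengthscale=0.5):
    """Squared-exponential kernel for the latent force prior u ~ GP(0, k)."""
    d = t1[:, None] - t2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)


# Draw latent forces from the GP prior (jitter for numerical stability).
K_u = rbf(t, t) + 1e-6 * np.eye(t.size)
L = np.linalg.cholesky(K_u)
n_samples = 2000
u = L @ rng.standard_normal((t.size, n_samples))  # each column is one draw

# Mechanistic assumption: first-order dynamics x'(t) = -decay * x(t) + u(t),
# integrated with forward Euler from x(0) = 0.
decay = 1.0
x = np.zeros_like(u)
for i in range(1, t.size):
    x[i] = x[i - 1] + dt * (-decay * x[i - 1] + u[i - 1])

# Empirical covariance of the outputs: a Monte Carlo estimate of the
# "physically inspired" kernel induced by the ODE plus the GP prior.
K_lfm = (x @ x.T) / n_samples
```

The point of the sketch is that a mechanistic assumption (here, exponential decay toward a forced equilibrium) plus a GP prior on the force is enough to define an output covariance, without writing the kernel down analytically.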
Efficient State-Space Inference of Periodic Latent Force Models
Latent force models (LFMs) are principled approaches to incorporating
solutions to differential equations within non-parametric inference methods.
Unfortunately, the development and application of LFMs can be inhibited by
their computational cost, especially when closed-form solutions for the LFM are
unavailable, as is the case in many real world problems where these latent
forces exhibit periodic behaviour. Given this, we develop a new sparse
representation of LFMs which considerably improves their computational
efficiency, as well as broadening their applicability, in a principled way, to
domains with periodic or near periodic latent forces. Our approach uses a
linear basis model to approximate one generative model for each periodic force.
We assume that the latent forces are generated from Gaussian process priors and
develop a linear basis model which fully expresses these priors. We apply our
approach to model the thermal dynamics of domestic buildings and show that it
is effective at predicting day-ahead temperatures within the homes. We also
apply our approach within queueing theory in which quasi-periodic arrival rates
are modelled as latent forces. In both cases, we demonstrate that our approach
can be implemented efficiently using state-space methods which encode the
linear dynamic systems via LFMs. Further, we show that state estimates obtained
using periodic latent force models can reduce the root mean squared error to
17% of that from non-periodic models and 27% of the nearest rival approach
which is the resonator model (Särkkä et al., 2012; Hartikainen et al., 2012).
Comment: 61 pages, 13 figures; accepted for publication in JMLR. Updates from
the earlier version occur throughout the article in response to JMLR review.
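The "linear basis model which fully expresses these priors" can be sketched with a truncated Fourier basis: a periodic kernel of the form Σⱼ σⱼ² cos(jω(t−t′)) is reproduced exactly by cosine/sine features with independent zero-mean Gaussian weight priors. The period, number of harmonics, and per-harmonic variances below are illustrative, not the paper's choices:

```python
import numpy as np

# Truncated Fourier basis for a periodic GP prior with period T.
T = 24.0                       # e.g. a daily period, in hours (illustrative)
omega = 2.0 * np.pi / T
J = 5                          # number of harmonics kept
var_j = 1.0 / (1.0 + np.arange(1, J + 1)) ** 2  # illustrative prior variances


def features(t):
    """phi(t) = [cos(j*omega*t), sin(j*omega*t)] for j = 1..J."""
    t = np.asarray(t, dtype=float)[:, None]
    j = np.arange(1, J + 1)[None, :]
    return np.concatenate([np.cos(j * omega * t), np.sin(j * omega * t)], axis=1)


def basis_kernel(t1, t2):
    """Kernel implied by the linear model f(t) = phi(t) @ w, w ~ N(0, S)."""
    S = np.diag(np.concatenate([var_j, var_j]))
    return features(t1) @ S @ features(t2).T


def target_kernel(t1, t2):
    """The periodic kernel the basis expresses: sum_j var_j cos(j*omega*(t-t'))."""
    d = np.asarray(t1, dtype=float)[:, None] - np.asarray(t2, dtype=float)[None, :]
    j = np.arange(1, J + 1)
    return np.sum(
        var_j[None, None, :] * np.cos(j[None, None, :] * omega * d[:, :, None]),
        axis=2,
    )
```

Because cos(jωt)cos(jωt′) + sin(jωt)sin(jωt′) = cos(jω(t−t′)), the two kernels agree exactly, which is what makes a finite linear basis a faithful (and computationally cheap, state-space-friendly) stand-in for the periodic prior.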
Doubly Stochastic Variational Inference for Deep Gaussian Processes
Gaussian processes (GPs) are a good choice for function approximation as they
are flexible, robust to over-fitting, and provide well-calibrated predictive
uncertainty. Deep Gaussian processes (DGPs) are multi-layer generalisations of
GPs, but inference in these models has proved challenging. Existing approaches
to inference in DGP models assume approximate posteriors that force
independence between the layers, and do not work well in practice. We present a
doubly stochastic variational inference algorithm, which does not force
independence between layers. With our method of inference we demonstrate that a
DGP model can be used effectively on data ranging in size from hundreds to a
billion points. We provide strong empirical evidence that our inference scheme
for DGPs works well in practice in both classification and regression.
Comment: NIPS 201
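The "doubly stochastic" estimator can be sketched in miniature: one source of randomness is minibatch subsampling of the data, the other is sampling function values layer by layer via the reparameterization trick, so the second layer is evaluated at sampled outputs of the first rather than under a forced-independence posterior. The parametric mean/variance maps below stand in for the sparse-GP layer conditionals used in the paper and are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data.
N = 1000
X = np.linspace(-3.0, 3.0, N)
Y = np.sin(X) + 0.1 * rng.standard_normal(N)


# Stand-ins for the per-layer variational posteriors q(f_l | input):
# simple parametric mean/variance maps replace the sparse-GP conditionals.
def q_layer1(x):
    return np.tanh(x), 0.05 * np.ones_like(x)   # mean, variance


def q_layer2(h):
    return 0.9 * h, 0.05 * np.ones_like(h)


def doubly_stochastic_loglik(X, Y, minibatch=64, noise_var=0.01):
    # Stochasticity 1: subsample a minibatch of the data.
    idx = rng.choice(X.size, size=minibatch, replace=False)
    x, y = X[idx], Y[idx]
    # Stochasticity 2: sample through the layers (reparameterization).
    # Layer 2 is evaluated at the *sampled* output of layer 1, so the
    # estimator does not force independence between layers.
    m1, v1 = q_layer1(x)
    f1 = m1 + np.sqrt(v1) * rng.standard_normal(minibatch)
    m2, v2 = q_layer2(f1)
    f2 = m2 + np.sqrt(v2) * rng.standard_normal(minibatch)
    # Unbiased estimate of the expected log-likelihood term, rescaled to N.
    loglik = -0.5 * np.log(2 * np.pi * noise_var) - 0.5 * (y - f2) ** 2 / noise_var
    return (X.size / minibatch) * loglik.sum()


estimate = doubly_stochastic_loglik(X, Y)
```

In the full method this stochastic likelihood term is combined with KL regularizers for each layer's inducing variables to form the ELBO; the minibatch rescaling by N/M is what lets the scheme reach very large datasets.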