7,093 research outputs found

    Bayesian Conditional Density Filtering

    Full text link
    We propose a Conditional Density Filtering (C-DF) algorithm for efficient online Bayesian inference. C-DF adapts MCMC sampling to the online setting, sampling from approximations to conditional posterior distributions obtained by propagating surrogate conditional sufficient statistics (functions of the data and parameter estimates) as new data arrive. These quantities eliminate the need to store or process the entire dataset at once, and typically yield reduced memory requirements and runtime, improved mixing, and state-of-the-art parameter inference and prediction. These improvements are demonstrated through several illustrative examples, including an application to high-dimensional compressed regression. Finally, we show that C-DF samples converge to the target posterior distribution asymptotically as sampling proceeds and more data arrive. Comment: 41 pages, 7 figures, 12 tables.
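
    As a toy illustration of propagating sufficient statistics rather than raw data, the sketch below runs online conjugate updates for a Gaussian mean with known variance. The variable names, batch sizes, and conjugate setup are illustrative assumptions rather than the paper's algorithm, which handles general models by plugging current parameter estimates into surrogate statistics.

```python
# Toy sketch (illustrative assumptions, not the paper's algorithm): online
# inference for a Gaussian mean with known variance, propagating the running
# sufficient statistics (sum and count) instead of storing past data.
import numpy as np

rng = np.random.default_rng(0)
stream = (rng.normal(2.0, 1.0, size=50) for _ in range(40))  # hypothetical data stream

mu0, tau0_sq = 0.0, 10.0   # prior mean and variance for the unknown mean mu
sigma_sq = 1.0             # known observation variance
s, n = 0.0, 0              # propagated sufficient statistics: sum and count

for batch in stream:
    s += batch.sum()       # update statistics with the new batch...
    n += batch.size        # ...then the batch itself can be discarded

    # Conditional posterior of mu given only the propagated statistics.
    post_var = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    post_mean = post_var * (mu0 / tau0_sq + s / sigma_sq)

    # Approximate posterior draws at this point in the stream; in C-DF, draws
    # (or estimates from them) would feed the surrogate statistics of other
    # parameter blocks in a non-conjugate model.
    draws = rng.normal(post_mean, np.sqrt(post_var), size=100)

print(f"posterior mean ~ {post_mean:.3f}, variance ~ {post_var:.5f}")
```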

    The Gibbs Sampler with Particle Efficient Importance Sampling for State-Space Models

    Full text link
    We consider Particle Gibbs (PG) as a tool for Bayesian analysis of non-linear, non-Gaussian state-space models. PG is a Monte Carlo (MC) approximation of the standard Gibbs procedure which uses sequential MC (SMC) importance sampling inside the Gibbs procedure to update the latent and potentially high-dimensional state trajectories. We propose to combine PG with a generic and easily implementable SMC approach known as Particle Efficient Importance Sampling (PEIS). By using SMC importance sampling densities which are approximately fully globally adapted to the targeted density of the states, PEIS can substantially improve the mixing and the efficiency of the PG draws from the posterior of the states and the parameters relative to existing PG implementations. The efficiency gains achieved by PEIS are illustrated in PG applications to a univariate stochastic volatility model for asset returns, a non-Gaussian nonlinear local-level model for interest rates, and a multivariate stochastic volatility model for the realized covariance matrix of asset returns.
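
    For intuition, the sketch below implements one conditional SMC sweep, the basic Particle Gibbs move that PEIS is designed to improve, using plain bootstrap proposals for a toy stochastic volatility model. The function name csmc_sweep, the model parametrisation, and the particle count are illustrative assumptions; the paper's contribution is replacing the bootstrap proposal with a globally adapted PEIS density.

```python
# Minimal sketch (assumptions, not the paper's PEIS proposal): one conditional
# SMC sweep for the stochastic volatility model
#   x_t = phi * x_{t-1} + sigma * eta_t,   y_t ~ N(0, exp(x_t)),
# with bootstrap proposals and the reference trajectory pinned to particle N-1.
import numpy as np

def csmc_sweep(y, x_ref, phi, sigma, n_particles=100, rng=None):
    rng = rng or np.random.default_rng()
    T, N = len(y), n_particles
    x = np.zeros((T, N))
    anc = np.zeros((T, N), dtype=int)

    # Initialise from the stationary distribution (requires |phi| < 1);
    # particle N-1 carries the reference trajectory.
    x[0] = rng.normal(0.0, sigma / np.sqrt(1 - phi**2), size=N)
    x[0, -1] = x_ref[0]
    logw = -0.5 * (np.log(2 * np.pi) + x[0] + y[0]**2 * np.exp(-x[0]))

    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        anc[t] = rng.choice(N, size=N, p=w)
        x[t] = phi * x[t - 1, anc[t]] + sigma * rng.normal(size=N)
        # Keep the reference trajectory in the particle system (conditional SMC).
        anc[t, -1] = N - 1
        x[t, -1] = x_ref[t]
        logw = -0.5 * (np.log(2 * np.pi) + x[t] + y[t]**2 * np.exp(-x[t]))

    # Sample one trajectory and trace back its ancestry.
    w = np.exp(logw - logw.max()); w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.zeros(T)
    for t in reversed(range(T)):
        traj[t] = x[t, k]
        k = anc[t, k]
    return traj
```
    In a full PG sampler this sweep would alternate with updates of the model parameters (here phi and sigma) conditional on the sampled state trajectory.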

    Efficient State-Space Inference of Periodic Latent Force Models

    Get PDF
    Latent force models (LFMs) are principled approaches to incorporating solutions to differential equations within non-parametric inference methods. Unfortunately, the development and application of LFMs can be inhibited by their computational cost, especially when closed-form solutions for the LFM are unavailable, as is the case in many real-world problems where the latent forces exhibit periodic behaviour. Given this, we develop a new sparse representation of LFMs which considerably improves their computational efficiency, as well as broadening their applicability, in a principled way, to domains with periodic or near-periodic latent forces. Our approach uses a linear basis model to approximate one generative model for each periodic force. We assume that the latent forces are generated from Gaussian process priors and develop a linear basis model which fully expresses these priors. We apply our approach to model the thermal dynamics of domestic buildings and show that it is effective at predicting day-ahead temperatures within the homes. We also apply our approach within queueing theory, in which quasi-periodic arrival rates are modelled as latent forces. In both cases, we demonstrate that our approach can be implemented efficiently using state-space methods which encode the linear dynamic systems via LFMs. Further, we show that state estimates obtained using periodic latent force models can reduce the root mean squared error to 17% of that from non-periodic models and to 27% of that from the nearest rival approach, the resonator model. Comment: 61 pages, 13 figures, accepted for publication in JMLR. Updates from the earlier version occur throughout the article in response to the JMLR review.
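
    As a rough sketch of the linear basis idea, the snippet below approximates a periodic Gaussian process prior with a truncated Fourier basis whose coefficient variances follow the Fourier weights of the exp-sine-squared periodic kernel. The function name, harmonic count, and 24-hour period are illustrative assumptions; the paper additionally embeds such a basis within a state-space latent force model.

```python
# Minimal sketch (assumptions, not the paper's exact construction): a linear
# Fourier basis whose coefficient priors reproduce a periodic GP prior.
import numpy as np
from scipy.special import iv  # modified Bessel function of the first kind

def periodic_basis_prior(t, period=24.0, lengthscale=1.0, n_harmonics=6):
    """Return basis matrix Phi and prior variances for its coefficients."""
    omega = 2 * np.pi / period
    ell_inv2 = 1.0 / lengthscale**2
    # Fourier-series weights of k(tau) = exp(-2 sin^2(omega*tau/2) / lengthscale^2).
    q = 2 * iv(np.arange(n_harmonics + 1), ell_inv2) * np.exp(-ell_inv2)
    q[0] /= 2.0
    cols = [np.cos(j * omega * t) for j in range(n_harmonics + 1)]
    cols += [np.sin(j * omega * t) for j in range(1, n_harmonics + 1)]
    Phi = np.column_stack(cols)
    var = np.concatenate([q, q[1:]])  # same weight for matching cos/sin pairs
    return Phi, var

rng = np.random.default_rng(1)
t = np.linspace(0.0, 72.0, 300)        # e.g. three days, hypothetical grid
Phi, var = periodic_basis_prior(t)
coeffs = rng.normal(0.0, np.sqrt(var), size=(var.size, 5))
force_draws = Phi @ coeffs             # five approximate draws from the periodic GP prior
```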

    Generalized Kernel-based Visual Tracking

    Full text link
    In this work we generalize plain mean shift (MS) trackers and attempt to overcome two limitations of standard MS trackers. It is well known that modeling and maintaining a representation of the target object is an important component of a successful visual tracker. However, little work has been done on building a robust template model for kernel-based MS tracking. In contrast to building a template from a single frame, we train a robust object representation model from a large amount of data. Tracking is viewed as a binary classification problem, and a discriminative classification rule is learned to distinguish between the object and the background. We adopt a support vector machine (SVM) for training. The tracker is then implemented by maximizing the classification score. An iterative optimization scheme very similar to MS is derived for this purpose. Comment: 12 pages.
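
    As a minimal sketch of the score-maximisation idea, the snippet below scores candidate locations with a linear SVM and climbs the resulting confidence surface with a mean-shift-style update. Raw grayscale patches as features, the helper functions score_map and mean_shift_track, and the window sizes are illustrative assumptions, not the paper's tracker.

```python
# Minimal sketch (assumptions): maximise a linear SVM classification score
# over the target location with a mean-shift-style iteration.
import numpy as np

def score_map(frame, w, b, patch_shape):
    """Linear SVM score for the patch centred at every valid pixel.

    patch_shape must have odd dimensions; w has length ph * pw.
    """
    H, W = frame.shape
    ph, pw = patch_shape
    scores = np.full((H, W), -np.inf)
    for i in range(ph // 2, H - ph // 2):
        for j in range(pw // 2, W - pw // 2):
            patch = frame[i - ph // 2:i + ph // 2 + 1,
                          j - pw // 2:j + pw // 2 + 1]
            scores[i, j] = float(w @ patch.ravel()) + b
    return scores

def mean_shift_track(frame, w, b, start, patch_shape, n_iter=20, win=15):
    """Climb the SVM confidence surface with mean-shift-like updates."""
    scores = score_map(frame, w, b, patch_shape)
    finite = scores[np.isfinite(scores)]
    conf = np.exp(scores - finite.max())   # positive weights, zero outside valid area
    y, x = start
    for _ in range(n_iter):
        ys = np.arange(max(0, y - win), min(frame.shape[0], y + win + 1))
        xs = np.arange(max(0, x - win), min(frame.shape[1], x + win + 1))
        Y, X = np.meshgrid(ys, xs, indexing="ij")
        wgt = conf[Y, X]
        if wgt.sum() == 0:
            break
        ny = int(round((wgt * Y).sum() / wgt.sum()))
        nx = int(round((wgt * X).sum() / wgt.sum()))
        if (ny, nx) == (y, x):
            break
        y, x = ny, nx
    return y, x
```
    The point here is only the mean-shift-style ascent on a classification score; a real tracker would use richer, kernel-weighted features for the SVM rather than raw pixel patches.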