Implicit particle methods and their connection with variational data assimilation
The implicit particle filter is a sequential Monte Carlo method for data
assimilation that guides the particles to the high-probability regions via a
sequence of steps that includes minimizations. We present a new and more
general derivation of this approach and extend the method to particle smoothing
as well as to data assimilation for perfect models. We show that the
minimizations required by implicit particle methods are similar to the ones one
encounters in variational data assimilation and explore the connection of
implicit particle methods with variational data assimilation. In particular, we
argue that existing variational codes can be converted into implicit particle
methods at a low cost, often yielding better estimates that are also equipped
with quantitative measures of the uncertainty. A detailed example is presented.
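The minimization step that connects implicit particle methods to variational data assimilation can be sketched on a one-dimensional linear-Gaussian toy problem, where the cost F is quadratic and its minimizer is available in closed form. This is a minimal illustration only; the model, parameter values, and variable names below are our own assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D toy model: x_k = a * x_{k-1} + noise (variance q),
# observed with noise of variance r.
a, q, r = 0.9, 0.25, 0.04
y = 1.3  # a single observation

def F(x, x_prev):
    # Negative log of p(y|x) p(x|x_prev): the same cost function
    # one minimizes in variational data assimilation.
    return 0.5 * (x - a * x_prev) ** 2 / q + 0.5 * (y - x) ** 2 / r

# Implicit sampling, per particle: minimize F (closed form here,
# since F is quadratic), then map a Gaussian reference sample
# through the equation F(x) - phi = 0.5 * xi**2.
x_prev = rng.normal(0.0, 1.0, size=100)   # particles at time k-1
prec = 1.0 / q + 1.0 / r                  # Hessian of F (constant)
mu = (a * x_prev / q + y / r) / prec      # argmin of F per particle
phi = F(mu, x_prev)                       # minimum value per particle
xi = rng.normal(size=x_prev.size)         # reference samples
x_new = mu + xi / np.sqrt(prec)           # solves F(x) - phi = xi^2/2
w = np.exp(-phi)                          # weights (Jacobian is constant)
w /= w.sum()
```

The weights depend only on the minima phi, which is what concentrates the particles in the high-probability region.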
Incremental Learning of Nonparametric Bayesian Mixture Models
Clustering is a fundamental task in many vision applications.
To date, most clustering algorithms work in a
batch setting and training examples must be gathered in a
large group before learning can begin. Here we explore
incremental clustering, in which data can arrive continuously.
We present a novel incremental model-based clustering
algorithm based on nonparametric Bayesian methods,
which we call Memory Bounded Variational Dirichlet
Process (MB-VDP). The number of clusters is determined
flexibly by the data, and the approach can be used to automatically
discover object categories. The computational cost
of producing model updates is bounded
and does not grow with the amount of data processed. The
technique is well suited to very large datasets, and we show
that our approach outperforms existing online alternatives
for learning nonparametric Bayesian mixture models.
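The incremental, bounded-memory flavor of such clustering can be illustrated with a DP-means-style heuristic: a related but much simpler nonparametric scheme (not MB-VDP itself) that creates clusters on the fly and keeps only per-cluster running statistics, so memory grows with the number of clusters rather than with the data. The threshold name `lam` and all data below are our own illustrative choices:

```python
import numpy as np

def dp_means_online(stream, lam):
    """Incremental DP-means-style clustering: a new cluster is created
    whenever a point is farther than `lam` from every existing center.
    Only running means and counts are stored, so memory is bounded by
    the number of clusters, not by the amount of data seen."""
    centers, counts, labels = [], [], []
    for x in stream:
        if centers:
            d = [np.linalg.norm(x - c) for c in centers]
            j = int(np.argmin(d))
        if not centers or d[j] > lam:
            centers.append(np.array(x, dtype=float))  # open a new cluster
            counts.append(1)
            labels.append(len(centers) - 1)
        else:
            counts[j] += 1
            # running-mean update: bounded per-cluster sufficient stats
            centers[j] += (x - centers[j]) / counts[j]
            labels.append(j)
    return centers, labels

# Two well-separated Gaussian blobs, processed as a stream.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
centers, labels = dp_means_online(data, lam=2.0)
```

Unlike MB-VDP, this sketch is not Bayesian and has no variational posterior; it only demonstrates the incremental, data-driven creation of clusters with bounded memory.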
A Factor Graph Approach to Automated Design of Bayesian Signal Processing Algorithms
The benefits of automating design cycles for Bayesian inference-based
algorithms are becoming increasingly recognized by the machine learning
community. As a result, interest in probabilistic programming frameworks has
increased considerably over the past few years. This paper explores a specific
probabilistic programming paradigm, namely message passing in Forney-style
factor graphs (FFGs), in the context of automated design of efficient Bayesian
signal processing algorithms. To this end, we developed "ForneyLab"
(https://github.com/biaslab/ForneyLab.jl) as a Julia toolbox for message
passing-based inference in FFGs. We show by example how ForneyLab enables
automatic derivation of Bayesian signal processing algorithms, including
algorithms for parameter estimation and model comparison. Crucially, due to the
modular makeup of the FFG framework, both the model specification and inference
methods are readily extensible in ForneyLab. In order to test this framework,
we compared variational message passing as implemented by ForneyLab with
automatic differentiation variational inference (ADVI) and Monte Carlo methods
as implemented by state-of-the-art tools "Edward" and "Stan". In terms of
performance, extensibility, and stability, ForneyLab appears to enjoy an
edge over its competitors for automated inference in state-space models.
Comment: Accepted for publication in the International Journal of Approximate Reasoning
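The core idea of message passing on a factor graph can be sketched for a one-dimensional linear-Gaussian state-space model, where forward Gaussian messages through the transition and observation nodes reproduce the Kalman filter. This is an illustrative Python sketch of the message-passing principle only; ForneyLab itself is a Julia toolbox, and none of the function names below correspond to its API:

```python
import numpy as np

def transition_msg(m, v, a, q):
    # Message through the transition node N(x'; a*x, q):
    # pushes a Gaussian (mean m, variance v) forward in time.
    return a * m, a * a * v + q

def observation_msg(y, r):
    # Likelihood message from the observation node N(y; x, r).
    return y, r

def multiply_gaussians(m1, v1, m2, v2):
    # Equality node: the product of two Gaussian messages.
    v = 1.0 / (1.0 / v1 + 1.0 / v2)
    m = v * (m1 / v1 + m2 / v2)
    return m, v

a, q, r = 0.9, 0.1, 0.2        # illustrative model parameters
m, v = 0.0, 1.0                # prior message over the initial state
for y in [1.1, 0.9, 1.0]:
    m, v = transition_msg(m, v, a, q)        # predict
    my, vy = observation_msg(y, r)
    m, v = multiply_gaussians(m, v, my, vy)  # update
```

In a toolbox like ForneyLab, the user specifies the model and the schedule of such messages is derived automatically, rather than hand-coded as above.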
Extended Object Tracking: Introduction, Overview and Applications
This article provides an elaborate overview of current research in extended
object tracking. We provide a clear definition of the extended object tracking
problem and discuss its delimitation to other types of object tracking. Next,
different aspects of extended object modelling are extensively discussed.
Subsequently, we give a tutorial introduction to two basic and widely used
extended object tracking approaches - the random matrix approach and the Kalman
filter-based approach for star-convex shapes. The next part treats the tracking
of multiple extended objects and elaborates how the large number of feasible
association hypotheses can be tackled using both Random Finite Set (RFS) and
Non-RFS multi-object trackers. The article concludes with a summary of current
applications, where four example applications involving camera, X-band radar,
light detection and ranging (lidar), and red-green-blue-depth (RGB-D) sensors are
highlighted.
Comment: 30 pages, 19 figures
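The random matrix approach mentioned above models the object extent as a symmetric positive-definite matrix that is updated from the spread of the measurements. A heavily simplified, Koch/Feldmann-style extent update can be sketched as follows; the function name, the parameter `alpha`, and the simplifications (no kinematic Kalman filter, no forgetting of the extent) are our own illustrative assumptions:

```python
import numpy as np

def extent_update(X, alpha, Z):
    """Simplified random-matrix extent update.
    X: current extent estimate (2x2 SPD matrix),
    alpha: confidence weight on the current estimate,
    Z: (n, 2) array of measurements from the extended object."""
    zbar = Z.mean(axis=0)
    spread = (Z - zbar).T @ (Z - zbar)        # measurement scatter matrix
    X_new = (alpha * X + spread) / (alpha + len(Z))
    return zbar, X_new, alpha + len(Z)

# Simulate measurements from an elongated elliptical object and
# watch the extent estimate converge toward the true shape.
rng = np.random.default_rng(2)
truth = np.diag([1.0, 0.25])                  # true extent (ellipse)
X, alpha = np.eye(2), 2.0                     # vague initial extent
for _ in range(50):
    Z = rng.multivariate_normal([0.0, 0.0], truth, size=20)
    centroid, X, alpha = extent_update(X, alpha, Z)
```

After many scans the estimate recovers the elongation of the object, which is the key benefit of tracking extent jointly with the centroid.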
Joint state-parameter estimation of a nonlinear stochastic energy balance model from sparse noisy data
While nonlinear stochastic partial differential equations arise naturally in
spatiotemporal modeling, inference for such systems often faces two major
challenges: sparse noisy data and ill-posedness of the inverse problem of
parameter estimation. To overcome these challenges, we introduce a strongly
regularized posterior by normalizing the likelihood and by imposing physical
constraints through priors of the parameters and states. We investigate joint
parameter-state estimation by the regularized posterior in a physically
motivated nonlinear stochastic energy balance model (SEBM) for paleoclimate
reconstruction. The high-dimensional posterior is sampled by a particle Gibbs
sampler that combines MCMC with an optimal particle filter exploiting the
structure of the SEBM. In tests using either Gaussian or uniform priors based
on the physical range of parameters, the regularized posteriors overcome the
ill-posedness and lead to samples within physical ranges, quantifying the
uncertainty in estimation. Due to the ill-posedness and the regularization, the
posterior of parameters presents a relatively large uncertainty, and
consequently, the maximum of the posterior, which is the minimizer in a
variational approach, can have a large variation. In contrast, the posterior of
states generally concentrates near the truth, substantially filtering out
observation noise and reducing uncertainty in the unconstrained SEBM.
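The effect of regularizing an ill-posed posterior with a uniform prior over the physical range of a parameter can be illustrated with a toy random-walk Metropolis sampler that rejects proposals outside the range. This is only a one-parameter caricature of the idea; the paper itself uses a particle Gibbs sampler for a much higher-dimensional joint state-parameter posterior, and all values below are our own assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
y = 2.0              # a single noisy observation of the parameter theta
lo, hi = 0.0, 3.0    # assumed physical range of theta

def log_post(theta):
    # Uniform prior on [lo, hi] acts as a hard physical constraint.
    if not (lo <= theta <= hi):
        return -np.inf
    return -0.5 * (y - theta) ** 2 / 0.5   # Gaussian likelihood, var 0.5

# Random-walk Metropolis: out-of-range proposals are always rejected,
# so every sample stays within the physical range.
theta, samples = 1.0, []
for _ in range(5000):
    prop = theta + 0.5 * rng.normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples)
```

The hard constraint keeps every posterior sample physical while the likelihood still concentrates the samples near the observation, which is the qualitative behavior the regularized posterior is designed to achieve.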