A Disentangled Recognition and Nonlinear Dynamics Model for Unsupervised Learning
This paper takes a step towards temporal reasoning in a dynamically changing
video, not in the pixel space that constitutes its frames, but in a latent
space that describes the non-linear dynamics of the objects in its world. We
introduce the Kalman variational auto-encoder, a framework for unsupervised
learning of sequential data that disentangles two latent representations: an
object's representation, coming from a recognition model, and a latent state
describing its dynamics. As a result, the evolution of the world can be
imagined and missing data imputed, both without the need to generate high
dimensional frames at each time step. The model is trained end-to-end on videos
of a variety of simulated physical systems, and outperforms competing methods
in generative and missing data imputation tasks.
Comment: NIPS 2017
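As a rough illustration of the latent-space imputation idea, the sketch below runs a Kalman filter and RTS smoother over a toy linear-Gaussian latent model and only maps back to observation space at the end. It is not the authors' code: the KVAE's recognition and decoder networks are replaced by identity-like stubs, and the matrices A, C, Q, R and all dimensions are invented for the demo.

```python
# Hedged sketch, not the authors' code: a linear-Gaussian latent state-space
# model with Kalman filtering and RTS smoothing used to impute a missing block
# entirely in latent space. The KVAE's recognition/decoder networks are
# replaced by identity-like stubs; A, C, Q, R and all sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
T, dz, da = 30, 2, 1
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # latent dynamics (constant velocity)
C = np.array([[1.0, 0.0]])               # emits the encoder-space feature a_t
Q = 0.01 * np.eye(dz)                    # process noise
R = 0.1 * np.eye(da)                     # pseudo-observation noise

# Simulate latent states and "encoder outputs" a_t, then hide a block of steps.
z = np.zeros((T, dz))
for t in range(1, T):
    z[t] = A @ z[t - 1] + rng.multivariate_normal(np.zeros(dz), Q)
a = (C @ z.T).T + rng.multivariate_normal(np.zeros(da), R, size=T)
observed = np.ones(T, bool)
observed[10:20] = False                  # frames 10..19 are missing

# Kalman filter (skip the update where a_t is missing), then RTS smoother.
m = np.zeros((T, dz)); P = np.zeros((T, dz, dz))
mp = np.zeros((T, dz)); Pp = np.zeros((T, dz, dz))
m_prev, P_prev = np.zeros(dz), np.eye(dz)
for t in range(T):
    mp[t] = A @ m_prev
    Pp[t] = A @ P_prev @ A.T + Q
    if observed[t]:
        S = C @ Pp[t] @ C.T + R
        K = Pp[t] @ C.T @ np.linalg.inv(S)
        m[t] = mp[t] + K @ (a[t] - C @ mp[t])
        P[t] = (np.eye(dz) - K @ C) @ Pp[t]
    else:                                # no frame: prediction only
        m[t], P[t] = mp[t], Pp[t]
    m_prev, P_prev = m[t], P[t]

ms = m.copy()
for t in range(T - 2, -1, -1):           # backward (RTS) smoothing pass
    J = P[t] @ A.T @ np.linalg.inv(Pp[t + 1])
    ms[t] = m[t] + J @ (ms[t + 1] - mp[t + 1])

# Only now would a decoder map the imputed features C @ ms[t] back to frames.
err = ms[~observed] @ C.T - z[~observed] @ C.T
print("imputation RMSE on the missing block:", np.sqrt(np.mean(err ** 2)))
```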
Data Assimilation by Conditioning on Future Observations
Conventional recursive filtering approaches, designed for quantifying the
state of an evolving uncertain dynamical system with intermittent observations,
use a sequence of (i) an uncertainty propagation step followed by (ii) a step
where the associated data is assimilated using Bayes' rule. In this paper we
switch the order of the steps to: (i) one step ahead data assimilation followed
by (ii) uncertainty propagation. This route leads to a class of filtering
algorithms named \emph{smoothing filters}. For a system driven by random noise,
our proposed methods require the probability distribution of the driving noise,
after assimilation, to be biased by a nonzero mean. The system noise,
conditioned on future observations, in turn pushes the filtering solution
forward in time, closer to the true state, and yields a more accurate
approximate solution to the state estimation problem.
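In the exactly linear-Gaussian case, the reordered steps recover the usual predict-then-update Kalman recursion, which makes for a compact consistency check; the advantage of the reordering is expected to show up in the approximate and nonlinear settings the paper targets. The sketch below is a hedged illustration, not the paper's algorithm, and every matrix and number in it is invented.

```python
# Hedged sketch, not the paper's algorithm: in the exactly linear-Gaussian
# case, conditioning the pair (current state x_k, driving noise w_k) on the
# next observation y_{k+1} and only then propagating reproduces the standard
# predict-update Kalman step. All matrices and numbers are invented.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # dynamics
H = np.array([[1.0, 0.0]])               # observation operator
Q = 0.05 * np.eye(2)                     # driving-noise covariance
R = np.array([[0.2]])                    # observation-noise covariance
m, P = np.zeros(2), np.eye(2)            # current filtering estimate of x_k
y_next = np.array([0.7])                 # the future observation y_{k+1}

# Conventional ordering: propagate uncertainty, then assimilate y_{k+1}.
m_pred, P_pred = A @ m, A @ P @ A.T + Q
S = H @ P_pred @ H.T + R
K = P_pred @ H.T @ np.linalg.inv(S)
m_kf = m_pred + K @ (y_next - H @ m_pred)
P_kf = (np.eye(2) - K @ H) @ P_pred

# Smoothing-filter ordering: assimilate first, then propagate.
# Joint Gaussian over z = [x_k, w_k], with y_{k+1} = H (A x_k + w_k) + v.
mu_z = np.concatenate([m, np.zeros(2)])
Cov_z = np.block([[P, np.zeros((2, 2))], [np.zeros((2, 2)), Q]])
G = np.hstack([A, np.eye(2)])            # x_{k+1} = G @ z
Czy = Cov_z @ G.T @ H.T                  # Cov(z, y_{k+1})
Cyy = H @ G @ Cov_z @ G.T @ H.T + R
gain = Czy @ np.linalg.inv(Cyy)
mu_z_post = mu_z + gain @ (y_next - H @ A @ m)   # the noise mean is now nonzero
Cov_z_post = Cov_z - gain @ Czy.T
m_sf = G @ mu_z_post                     # push the conditioned pair forward
P_sf = G @ Cov_z_post @ G.T

print("means agree:      ", np.allclose(m_kf, m_sf))
print("covariances agree:", np.allclose(P_kf, P_sf))
```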
A variational approach to path estimation and parameter inference of hidden diffusion processes
We consider a hidden Markov model, where the signal process, given by a
diffusion, is only indirectly observed through some noisy measurements. The
article develops a variational method for approximating the hidden states of
the signal process given the full set of observations. This, in particular,
leads to systematic approximations of the smoothing densities of the signal
process. The paper then demonstrates how an efficient inference scheme, based
on this variational approach to the approximation of the hidden states, can be
designed to estimate the unknown parameters of stochastic differential
equations. Two examples at the end illustrate the efficacy and the accuracy of
the presented method.
Comment: 37 pages, 2 figures
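A minimal sketch of the flavor of such a scheme, under strong simplifying assumptions: an Euler-discretized Ornstein-Uhlenbeck diffusion observed with Gaussian noise, a fully factorized Gaussian variational family, and a closed-form ELBO maximized numerically. The model, the mean-field factorization and every constant below are illustrative choices, not the paper's specific construction.

```python
# Hedged sketch, not the paper's specific construction: a fully factorized
# Gaussian variational approximation to the smoothing distribution of an
# Euler-discretized Ornstein-Uhlenbeck diffusion observed with Gaussian noise.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
T, dt, theta, sigma, r = 50, 0.05, 1.5, 0.8, 0.3
a, qv, r2 = 1.0 - theta * dt, sigma**2 * dt, r**2

# Simulate a latent path and noisy observations at every fifth step.
x = np.zeros(T)
for k in range(T - 1):
    x[k + 1] = a * x[k] + np.sqrt(qv) * rng.standard_normal()
obs_idx = np.arange(0, T, 5)
y = x[obs_idx] + r * rng.standard_normal(obs_idx.size)

def neg_elbo(params):
    m, ls = params[:T], params[T:]
    v = np.exp(2.0 * ls)                 # marginal variances of q(x_k)
    # expected log-likelihood of the observations under q
    val = -0.5 * np.sum(((y - m[obs_idx]) ** 2 + v[obs_idx]) / r2)
    # expected log prior: unit-variance initial state plus Markov transitions
    val += -0.5 * (m[0] ** 2 + v[0])
    val += -0.5 * np.sum(((m[1:] - a * m[:-1]) ** 2 + v[1:] + a**2 * v[:-1]) / qv)
    # entropy of the factorized Gaussian (additive constants dropped throughout)
    val += 0.5 * np.sum(np.log(v))
    return -val

init = np.concatenate([np.zeros(T), np.full(T, -1.0)])
res = minimize(neg_elbo, init, method="L-BFGS-B")
m_hat = res.x[:T]
print("RMSE of the variational smoothing mean:", np.sqrt(np.mean((m_hat - x) ** 2)))
# Maximizing the same ELBO jointly over (theta, sigma) would give the kind of
# parameter estimates for the SDE that the paper's inference scheme targets.
```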
Learning the dynamics and time-recursive boundary detection of deformable objects
We propose a principled framework for recursively segmenting deformable objects across a sequence
of frames. We demonstrate the usefulness of this method on left ventricular segmentation across a cardiac
cycle. The approach involves a technique for learning the system dynamics together with methods of
particle-based smoothing as well as non-parametric belief propagation on a loopy graphical model capturing
the temporal periodicity of the heart. The dynamic system state is a low-dimensional representation
of the boundary, and the boundary estimation involves incorporating curve evolution into recursive state
estimation. By formulating the problem as one of state estimation, the segmentation at each particular
time is based not only on the data observed at that instant, but also on predictions based on past and future
boundary estimates. Although the paper focuses on left ventricle segmentation, the method generalizes
to temporally segmenting any deformable object.
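The toy sketch below illustrates only the recursive-estimation core in one dimension: a bootstrap particle filter tracking a single boundary descriptor (the radius of a roughly circular contour) driven by a periodic "cardiac" cycle. The dynamics, noise levels and observation model are invented; the paper's full pipeline additionally involves particle-based smoothing, nonparametric belief propagation and curve evolution.

```python
# Toy sketch, not the authors' system: a bootstrap particle filter tracking a
# single low-dimensional boundary descriptor (contour radius) driven by a
# periodic cycle. All dynamics, noise levels and observations are invented.
import numpy as np

rng = np.random.default_rng(3)
T, N, period = 60, 500, 20.0
true_r = 1.0 + 0.3 * np.sin(2 * np.pi * np.arange(T) / period)
obs = true_r + 0.1 * rng.standard_normal(T)      # noisy radius measurements

def dynamics(r, t):
    # stand-in for learned dynamics: drift toward the periodic mean plus noise
    mean = 1.0 + 0.3 * np.sin(2 * np.pi * t / period)
    return r + 0.5 * (mean - r) + 0.05 * rng.standard_normal(r.shape)

particles = np.full(N, 1.0)
est = np.zeros(T)
for t in range(T):
    particles = dynamics(particles, t)
    logw = -0.5 * ((obs[t] - particles) / 0.1) ** 2   # observation likelihood
    w = np.exp(logw - logw.max()); w /= w.sum()
    est[t] = np.sum(w * particles)                    # filtered boundary estimate
    particles = particles[rng.choice(N, size=N, p=w)] # multinomial resampling

print("mean absolute radius error:", np.mean(np.abs(est - true_r)))
```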
A Factor Graph Approach to Automated Design of Bayesian Signal Processing Algorithms
The benefits of automating design cycles for Bayesian inference-based
algorithms are becoming increasingly recognized by the machine learning
community. As a result, interest in probabilistic programming frameworks has
increased considerably over the past few years. This paper explores a specific
probabilistic programming paradigm, namely message passing in Forney-style
factor graphs (FFGs), in the context of automated design of efficient Bayesian
signal processing algorithms. To this end, we developed "ForneyLab"
(https://github.com/biaslab/ForneyLab.jl) as a Julia toolbox for message
passing-based inference in FFGs. We show by example how ForneyLab enables
automatic derivation of Bayesian signal processing algorithms, including
algorithms for parameter estimation and model comparison. Crucially, due to the
modular makeup of the FFG framework, both the model specification and inference
methods are readily extensible in ForneyLab. In order to test this framework,
we compared variational message passing as implemented by ForneyLab with
automatic differentiation variational inference (ADVI) and Monte Carlo methods
as implemented by state-of-the-art tools "Edward" and "Stan". In terms of
performance, extensibility and stability, ForneyLab appears to have an
edge over its competitors for automated inference in state-space models.
Comment: Accepted for publication in the International Journal of Approximate Reasoning
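ForneyLab itself is a Julia toolbox, so the snippet below is a plain-Python illustration (not ForneyLab's API) of the underlying idea: sum-product message passing on a chain-structured factor graph, which for a discrete hidden Markov model reduces to forward/backward messages and also yields the model evidence used for model comparison. The transition, observation and prior tables and the observed sequence are invented for the example.

```python
# Plain-Python illustration (not ForneyLab's Julia API): sum-product message
# passing on a chain-structured factor graph, which for a discrete hidden
# Markov model reduces to forward/backward messages.
import numpy as np

A = np.array([[0.9, 0.1], [0.2, 0.8]])    # transition factor p(x_t | x_{t-1})
B = np.array([[0.8, 0.2], [0.3, 0.7]])    # observation factor p(y_t | x_t)
prior = np.array([0.6, 0.4])
y = [0, 0, 1, 1, 0]                       # observed symbols

T, S = len(y), prior.size
fwd = np.zeros((T, S))                    # messages flowing forward in time
bwd = np.ones((T, S))                     # messages flowing backward in time

fwd[0] = prior * B[:, y[0]]
for t in range(1, T):
    fwd[t] = (fwd[t - 1] @ A) * B[:, y[t]]
for t in range(T - 2, -1, -1):
    bwd[t] = A @ (B[:, y[t + 1]] * bwd[t + 1])

marginals = fwd * bwd
marginals /= marginals.sum(axis=1, keepdims=True)   # posterior state marginals
evidence = fwd[-1].sum()                  # model evidence, usable for comparison
print("posterior state marginals:\n", marginals)
print("log evidence:", np.log(evidence))
```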
Variational approach for learning Markov processes from time series data
Inference, prediction and control of complex dynamical systems from time
series is important in many areas, including financial markets, power grid
management, climate and weather modeling, and molecular dynamics. The analysis
of such highly nonlinear dynamical systems is facilitated by the fact that we
can often find a (generally nonlinear) transformation of the system coordinates
to features in which the dynamics can be approximated very well by a linear
Markovian model. Moreover, the many system variables often change
collectively on large time- and length-scales, facilitating a low-dimensional
analysis in feature space. In this paper, we introduce a variational approach
for Markov processes (VAMP) that allows us to find optimal feature mappings and
optimal Markovian models of the dynamics from given time series data. The key
insight is that the best linear model can be obtained from the top singular
components of the Koopman operator. This leads to the definition of a family of
score functions, called VAMP-r, which can be calculated from data and can be
employed to optimize a Markovian model. In addition, based on the relationship
between the variational scores and approximation errors of Koopman operators,
we propose a new VAMP-E score, which can be applied to cross-validation for
hyper-parameter optimization and model selection in VAMP. VAMP is valid for
both reversible and nonreversible processes and for stationary and
non-stationary processes or realizations.
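A minimal sketch of one common VAMP-2 estimator, under illustrative choices of dynamics, features and lag time: the score is the sum of squared singular values of the covariance-weighted matrix C00^{-1/2} C01 C11^{-1/2} built from mean-free instantaneous and time-lagged covariances. Comparing scores across feature sets mirrors how the score is used to optimize feature mappings; everything named below is an assumption for the demo, not the reference implementation.

```python
# Illustrative sketch of one common VAMP-2 estimator (not the reference
# implementation). Dynamics, features and the lag time are invented.
import numpy as np

rng = np.random.default_rng(4)
n, lag = 20000, 5
x = np.zeros(n)
for t in range(n - 1):                    # overdamped double-well toy dynamics
    x[t + 1] = x[t] - 0.05 * (4 * x[t] ** 3 - 4 * x[t]) + 0.2 * rng.standard_normal()

def vamp2_score(features, lag):
    X, Y = features[:-lag], features[lag:]            # instantaneous / time-lagged
    X = X - X.mean(0); Y = Y - Y.mean(0)              # mean-free features
    C00 = X.T @ X / len(X)
    C11 = Y.T @ Y / len(Y)
    C01 = X.T @ Y / len(X)
    def inv_sqrt(C):
        w, V = np.linalg.eigh(C)
        return V @ np.diag(w ** -0.5) @ V.T
    K = inv_sqrt(C00) @ C01 @ inv_sqrt(C11)           # half-weighted Koopman matrix
    s = np.linalg.svd(K, compute_uv=False)            # VAMP singular values
    return np.sum(s ** 2)

linear = x[:, None]                                   # feature set 1: identity
nonlin = np.column_stack([x, np.tanh(2 * x)])         # feature set 2: adds a soft well indicator
print("VAMP-2 score, linear feature    :", vamp2_score(linear, lag))
print("VAMP-2 score, nonlinear features:", vamp2_score(nonlin, lag))
# Higher scores indicate feature mappings whose linear Markovian model better
# captures the slow dynamics; the VAMP-E variant supports cross-validation.
```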
Moment-Based Variational Inference for Markov Jump Processes
We propose moment-based variational inference as a flexible framework for
approximate smoothing of latent Markov jump processes. The main ingredient of
our approach is to partition the set of all transitions of the latent process
into classes. This allows us to express the Kullback-Leibler divergence between
the approximate and the exact posterior process in terms of a set of moment
functions that arise naturally from the chosen partition. To illustrate
possible choices of the partition, we consider special classes of jump
processes that frequently occur in applications. We then extend the results to
parameter inference and demonstrate the method on several examples.
Comment: Accepted by the 36th International Conference on Machine Learning (ICML 2019).