Analytic Moment-based Gaussian Process Filtering
We propose an analytic moment-based filter for nonlinear stochastic dynamic systems modeled by Gaussian processes. Exact expressions for the expected value and the covariance matrix are provided for both the prediction step and the filter step, where an additional Gaussian assumption is exploited in the latter case. Our filter does not require further approximations; in particular, it avoids finite-sample approximations. We compare the filter to a variety of Gaussian filters, namely the EKF, the UKF, and the recent GP-UKF proposed by Ko et al. (2007).
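The key analytic ingredient behind such moment-based filters is that, for a squared-exponential kernel, the GP posterior mean can be averaged over a Gaussian-distributed input in closed form. The following is a minimal one-dimensional sketch of that idea (kernel hyperparameters, the sine training data, and the helper names are illustrative assumptions, not the paper's setup), with a Monte Carlo average as a sanity check:

```python
import numpy as np

def se_kernel(a, b, sf2=1.0, ell=1.0):
    # Squared-exponential covariance k(a, b) = sf2 * exp(-(a - b)^2 / (2 ell^2)).
    return sf2 * np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

def gp_fit(X, y, sf2=1.0, ell=1.0, sn2=1e-4):
    # Standard GP regression weights: beta = (K + sn2 I)^{-1} y.
    K = se_kernel(X, X, sf2, ell) + sn2 * np.eye(len(X))
    return np.linalg.solve(K, y)

def exact_pred_mean(X, beta, mu, s2, sf2=1.0, ell=1.0):
    # E[m(x*)] for x* ~ N(mu, s2): the Gaussian integral of each SE kernel
    # term is available in closed form, so no sampling is needed.
    c = np.sqrt(ell ** 2 / (ell ** 2 + s2))
    return sf2 * c * np.sum(beta * np.exp(-(mu - X) ** 2 / (2 * (ell ** 2 + s2))))

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 25)
y = np.sin(X)                                  # toy training data
beta = gp_fit(X, y)
mu, s2 = 0.5, 0.2                              # Gaussian input distribution
analytic = exact_pred_mean(X, beta, mu, s2)

# Monte Carlo check: average the GP posterior mean over samples of x*.
xs = rng.normal(mu, np.sqrt(s2), 200000)
mc = np.mean(se_kernel(xs, X) @ beta)
print(analytic, mc)  # the two estimates should agree closely
```

The same kind of Gaussian integral yields the predictive covariance, which is what lets the prediction step proceed without finite-sample approximations.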
Robust Filtering and Smoothing with Gaussian Processes
We propose a principled algorithm for robust Bayesian filtering and smoothing
in nonlinear stochastic dynamic systems when both the transition function and
the measurement function are described by non-parametric Gaussian process (GP)
models. GPs are gaining increasing importance in signal processing, machine
learning, robotics, and control for representing unknown system functions by
posterior probability distributions. This modern way of "system identification"
is more robust than finding point estimates of a parametric function
representation. In this article, we present a principled algorithm for robust
analytic smoothing in GP dynamic systems, which are increasingly used in
robotics and control. Our numerical evaluations demonstrate the robustness of
the proposed approach in situations where other state-of-the-art Gaussian
filters and smoothers can fail.Comment: 7 pages, 1 figure, draft version of paper accepted at IEEE
Transactions on Automatic Contro
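In the special case of linear transition and measurement functions, the filtering and smoothing recursions the abstract refers to reduce to the classical Kalman filter and Rauch-Tung-Striebel smoother. The sketch below shows that scalar special case (the model parameters `a`, `c`, `q`, `r` are hypothetical, not taken from the paper):

```python
import numpy as np

def kalman_rts(ys, a=0.9, c=1.0, q=0.1, r=0.5, m0=0.0, p0=1.0):
    # Forward Kalman filter followed by the RTS backward smoothing pass,
    # for the scalar model x_t = a x_{t-1} + w, y_t = c x_t + v.
    n = len(ys)
    mf, pf = np.zeros(n), np.zeros(n)   # filtered means / variances
    mp, pp = np.zeros(n), np.zeros(n)   # predicted means / variances
    m, p = m0, p0
    for t, y in enumerate(ys):
        mp[t], pp[t] = a * m, a * a * p + q        # prediction step
        k = pp[t] * c / (c * c * pp[t] + r)        # Kalman gain
        m = mp[t] + k * (y - c * mp[t])            # filter (update) step
        p = (1 - k * c) * pp[t]
        mf[t], pf[t] = m, p
    ms, ps = mf.copy(), pf.copy()                  # RTS backward pass
    for t in range(n - 2, -1, -1):
        g = pf[t] * a / pp[t + 1]                  # smoother gain
        ms[t] = mf[t] + g * (ms[t + 1] - mp[t + 1])
        ps[t] = pf[t] + g * g * (ps[t + 1] - pp[t + 1])
    return mf, pf, ms, ps

rng = np.random.default_rng(1)
x, ys = 0.0, []
for _ in range(100):                               # simulate the toy system
    x = 0.9 * x + rng.normal(0, np.sqrt(0.1))
    ys.append(x + rng.normal(0, np.sqrt(0.5)))
mf, pf, ms, ps = kalman_rts(np.array(ys))
print(np.all(ps <= pf + 1e-9))  # smoothing never increases uncertainty
```

The GP-based approach in the paper replaces the known linear functions above with GP posteriors over the unknown transition and measurement functions, and propagates moments analytically instead.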
A New Perspective and Extension of the Gaussian Filter
The Gaussian Filter (GF) is one of the most widely used filtering algorithms;
instances are the Extended Kalman Filter, the Unscented Kalman Filter and the
Divided Difference Filter. GFs represent the belief of the current state by a
Gaussian with the mean being an affine function of the measurement. We show
that this representation can be too restrictive to accurately capture the
dependences in systems with nonlinear observation models, and we investigate
how the GF can be generalized to alleviate this problem. To this end, we view
the GF from a variational-inference perspective. We analyse how restrictions on
the form of the belief can be relaxed while maintaining simplicity and
efficiency. This analysis provides a basis for generalizations of the GF. We
propose one such generalization which coincides with a GF using a virtual
measurement, obtained by applying a nonlinear function to the actual
measurement. Numerical experiments show that the proposed Feature Gaussian
Filter (FGF) can have a substantial performance advantage over the standard GF
for systems with nonlinear observation models.
Comment: Will appear in Robotics: Science and Systems (R:SS) 201
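The virtual-measurement construction can be sketched generically: estimate the joint moments of the state x and a transformed measurement z = phi(y), then apply the usual affine (Kalman-style) update in z. The sketch below estimates the moments by plain sampling, which is an illustrative assumption on my part rather than the paper's derivation; with phi the identity and a linear model, it should recover the ordinary Kalman update:

```python
import numpy as np

def gf_update(m, P, h, phi, r_std, y_obs, n=200000, seed=0):
    # Gaussian-filter measurement update with a virtual measurement z = phi(y).
    # Joint moments of (x, z) are estimated from samples of the prior belief.
    rng = np.random.default_rng(seed)
    xs = rng.normal(m, np.sqrt(P), n)              # prior samples
    zs = phi(h(xs) + rng.normal(0, r_std, n))      # virtual measurements
    mz, vz = zs.mean(), zs.var()
    cxz = np.mean((xs - xs.mean()) * (zs - mz))    # cross-covariance Cov[x, z]
    k = cxz / vz                                   # affine gain
    m_post = m + k * (phi(y_obs) - mz)
    P_post = P - k * cxz
    return m_post, P_post

# Linear sanity check: h(x) = 2x with phi = identity matches the Kalman update.
m, P, r = 1.0, 2.0, 0.5
y = 3.0
m_gf, P_gf = gf_update(m, P, lambda x: 2 * x, lambda z: z, np.sqrt(r), y)
K = 2 * P / (4 * P + r)                            # exact Kalman gain
m_kf, P_kf = m + K * (y - 2 * m), P - K * 2 * P
print(m_gf, m_kf)
```

Choosing a nonlinear feature phi is where the FGF departs from the standard GF: it lets the posterior mean depend nonlinearly on the actual measurement while keeping the simple Gaussian update machinery.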
Nonparametric Uncertainty Quantification for Stochastic Gradient Flows
This paper presents a nonparametric statistical modeling method for
quantifying uncertainty in stochastic gradient systems with isotropic
diffusion. The central idea is to apply the diffusion maps algorithm to a
training data set to produce a stochastic matrix whose generator is a discrete
approximation to the backward Kolmogorov operator of the underlying dynamics.
The eigenvectors of this stochastic matrix, which we will refer to as the
diffusion coordinates, are discrete approximations to the eigenfunctions of the
Kolmogorov operator and form an orthonormal basis for functions defined on the
data set. Using this basis, we consider the projection of three uncertainty
quantification (UQ) problems (prediction, filtering, and response) into the
diffusion coordinates. In these coordinates, the nonlinear prediction and
response problems reduce to solving systems of infinite-dimensional linear
ordinary differential equations. Similarly, the continuous-time nonlinear
filtering problem reduces to solving a system of infinite-dimensional linear
stochastic differential equations. Solving the UQ problems then reduces to
solving the corresponding truncated linear systems in finitely many diffusion
coordinates. By solving these systems we give a model-free algorithm for UQ on
gradient flow systems with isotropic diffusion. We numerically verify these
algorithms on a 1-dimensional linear gradient flow system where the analytic
solutions of the UQ problems are known. We also apply the algorithm to a
chaotically forced nonlinear gradient flow system which is known to be well
approximated as a stochastically forced gradient flow.
Comment: Find the associated videos at: http://personal.psu.edu/thb11
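The first step of the pipeline above, building a row-stochastic matrix from data and reading off its eigenvectors as diffusion coordinates, can be sketched in a few lines. This uses a fixed kernel bandwidth for simplicity (practical versions, including variable-bandwidth kernels, differ in the normalizations used; the parameters here are illustrative):

```python
import numpy as np

def diffusion_maps(data, eps=0.1, n_coords=5):
    # Gaussian kernel on pairwise squared distances of a 1-D data set.
    d2 = (data[:, None] - data[None, :]) ** 2
    K = np.exp(-d2 / eps)
    q = K.sum(axis=1)
    K = K / np.outer(q, q)                   # density normalization (alpha = 1)
    P = K / K.sum(axis=1, keepdims=True)     # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)           # leading eigenvalues first
    return vals.real[order][:n_coords], vecs.real[:, order][:, :n_coords], P

rng = np.random.default_rng(2)
data = np.sort(rng.uniform(-1, 1, 300))      # stand-in "training data set"
vals, coords, P = diffusion_maps(data)
print(np.allclose(P.sum(axis=1), 1.0), vals[0])
```

The leading eigenvalue of a stochastic matrix is 1, and the corresponding trivial eigenvector is constant; the subsequent eigenvectors are the discrete approximations to the Kolmogorov eigenfunctions in which the UQ problems are projected.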
Optimal Clustering under Uncertainty
Classical clustering algorithms typically either lack an underlying
probability framework to make them predictive or focus on parameter estimation
rather than defining and minimizing a notion of error. Recent work addresses
these issues by developing a probabilistic framework based on the theory of
random labeled point processes and characterizing a Bayes clusterer that
minimizes the number of misclustered points. The Bayes clusterer is analogous
to the Bayes classifier. Whereas determining a Bayes classifier requires full
knowledge of the feature-label distribution, deriving a Bayes clusterer
requires full knowledge of the point process. When uncertain of the point
process, one would like to find a robust clusterer that is optimal over the
uncertainty, just as one may find optimal robust classifiers with uncertain
feature-label distributions. Herein, we derive an optimal robust clusterer by
first finding an effective random point process that incorporates all
randomness within its own probabilistic structure and from which a Bayes
clusterer can be derived that provides an optimal robust clusterer relative to
the uncertainty. This is analogous to the use of effective class-conditional
distributions in robust classification. After evaluating the performance of
robust clusterers in synthetic mixtures of Gaussians models, we apply the
framework to granular imaging, where we make use of the asymptotic
granulometric moment theory for granular images to relate robust clustering
theory to the application.
Comment: 19 pages, 5 eps figures, 1 table
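The "effective" construction can be illustrated in a deliberately simplified form: when component means are uncertain, integrate them out under an assumed prior to obtain effective class-conditional densities, then assign each point to the component with the larger effective density. Everything below (scalar data, Gaussian priors on the means, Monte Carlo integration) is an illustrative assumption, not the paper's random labeled point process or its granular-imaging model:

```python
import numpy as np

def npdf(x, mu, s):
    # Univariate normal density N(x; mu, s^2).
    return np.exp(-(x - mu) ** 2 / (2 * s * s)) / (s * np.sqrt(2 * np.pi))

def robust_cluster(x, prior0, prior1, s=1.0, n=5000, seed=3):
    # Effective class-conditional densities: average the component density
    # over samples of the uncertain mean, then label by the larger one.
    rng = np.random.default_rng(seed)
    mu0 = rng.normal(*prior0, n)
    mu1 = rng.normal(*prior1, n)
    f0 = npdf(x[:, None], mu0[None, :], s).mean(axis=1)
    f1 = npdf(x[:, None], mu1[None, :], s).mean(axis=1)
    return (f1 > f0).astype(int)

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(-4, 1, 50), rng.normal(4, 1, 50)])
labels = robust_cluster(x, prior0=(-4, 0.5), prior1=(4, 0.5))
print(labels[:50].mean(), labels[50:].mean())  # mostly 0s, then mostly 1s
```

The paper's Bayes clusterer operates on partitions of the whole point set rather than pointwise labels, but the same principle applies: fold all model uncertainty into one effective probabilistic structure and optimize against it.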
Sequential Monte Carlo samplers for semilinear inverse problems and application to magnetoencephalography
We discuss the use of a recent class of sequential Monte Carlo methods for
solving inverse problems characterized by a semi-linear structure, i.e. where
the data depend linearly on a subset of variables and nonlinearly on the
remaining ones. In this type of problems, under proper Gaussian assumptions one
can marginalize the linear variables. This means that the Monte Carlo procedure
needs only to be applied to the nonlinear variables, while the linear ones can
be treated analytically; as a result, the Monte Carlo variance and/or the
computational cost decrease. We use this approach to solve the inverse problem
of magnetoencephalography, with a multi-dipole model for the sources. Here,
data depend nonlinearly on the number of sources and their locations, and
depend linearly on their current vectors. The semi-analytic approach enables us
to estimate the number of dipoles and their location from a whole time-series,
rather than a single time point, while keeping a low computational cost.
Comment: 26 pages, 6 figures
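The marginalization step can be sketched on a toy semi-linear model: data y = b * g(theta) + noise is linear in b and nonlinear in theta, and with a Gaussian prior on b the linear variable integrates out in closed form, so Monte Carlo is applied to theta alone. The model y_t = b * sin(theta * t) and all parameters below are stand-ins for illustration, not the multi-dipole MEG model:

```python
import numpy as np

def log_marginal(y, g, sb2=1.0, s2=0.1):
    # log N(y; 0, sb2 * g g^T + s2 * I), i.e. the likelihood with the linear
    # coefficient b ~ N(0, sb2) integrated out; the rank-one structure lets us
    # use the matrix inversion lemma instead of a full matrix inverse.
    n = len(y)
    v = s2 + sb2 * (g @ g)                      # innovation term
    quad = (y @ y) / s2 - sb2 * (g @ y) ** 2 / (s2 * v)
    logdet = (n - 1) * np.log(s2) + np.log(v)
    return -0.5 * (quad + logdet + n * np.log(2 * np.pi))

rng = np.random.default_rng(5)
t = np.arange(1, 41)
theta_true, b_true = 0.7, 1.5
y = b_true * np.sin(theta_true * t) + rng.normal(0, np.sqrt(0.1), len(t))

# Importance sampling over the nonlinear variable only.
thetas = rng.uniform(0.1, 1.5, 4000)
logw = np.array([log_marginal(y, np.sin(th * t)) for th in thetas])
w = np.exp(logw - logw.max())
w /= w.sum()
theta_hat = np.sum(w * thetas)                  # weighted posterior mean
print(theta_hat)
```

Because the samples live only in the nonlinear variable, the same particle budget covers a much smaller space, which is exactly the variance and cost reduction the abstract describes.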