Consistent estimation of the filtering and marginal smoothing distributions in nonparametric hidden Markov models
In this paper, we consider the filtering and smoothing recursions in
nonparametric finite state space hidden Markov models (HMMs) when the
parameters of the model are unknown and replaced by estimators. We provide an
explicit and time uniform control of the filtering and smoothing errors in
total variation norm as a function of the parameter estimation errors. We prove
that the risk for the filtering and smoothing errors may be uniformly upper
bounded by the risk of the estimators. It has been proved very recently that
statistical inference for finite state space nonparametric HMMs is possible. We
study how the recent spectral methods developed in the parametric setting may
be extended to the nonparametric framework and we give explicit upper bounds
for the L2-risk of the nonparametric spectral estimators. When the observation
space is compact, this provides explicit rates for the filtering and smoothing
errors in total variation norm. The performance of the spectral method is
assessed with simulated data for both the estimation of the (nonparametric)
conditional distribution of the observations and the estimation of the marginal
smoothing distributions.
Comment: 27 pages, 2 figures. arXiv admin note: text overlap with
arXiv:1501.0478
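The plug-in idea behind the filtering recursion above can be sketched in a few lines: run the standard forward recursion of a finite state space HMM, but with the transition matrix and emission densities replaced by estimators. This is an illustrative sketch, not the paper's estimator; the model and all names here are hypothetical.

```python
import numpy as np

def filter_hmm(pi0, A_hat, emission_probs):
    """Forward filtering recursion for a finite state space HMM with
    plug-in (estimated) parameters.

    pi0: (K,) initial distribution; A_hat: (K, K) estimated transition
    matrix with A_hat[i, j] = P(X_{t+1}=j | X_t=i); emission_probs:
    (T, K) array whose row t gives the estimated likelihood of the
    observation y_t under each of the K states.
    Returns the (T, K) array of filtering distributions P(X_t | y_{1:t}).
    """
    T, K = emission_probs.shape
    filt = np.zeros((T, K))
    pred = pi0
    for t in range(T):
        unnorm = pred * emission_probs[t]   # Bayes update with y_t
        filt[t] = unnorm / unnorm.sum()     # normalize to a distribution
        pred = filt[t] @ A_hat              # one-step prediction
    return filt
```

The paper's total variation bounds say, roughly, that the error of `filt` relative to the recursion run with the true parameters is controlled, uniformly in `t`, by the estimation error in `A_hat` and in the emission densities.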
Approximate Bayesian Computation for a Class of Time Series Models
In the following article we consider approximate Bayesian computation (ABC)
for certain classes of time series models. In particular, we focus upon
scenarios where the likelihoods of the observations and parameter are
intractable, by which we mean that one cannot evaluate the likelihood even
up to a positive unbiased estimate. This paper reviews and develops a class of
approximation procedures based upon the idea of ABC which specifically
maintains the probabilistic structure of the original statistical model. This
idea is useful, in that it can facilitate an analysis of the bias of the
approximation and the adaptation of established computational methods for
parameter inference. Several existing results in the literature are surveyed
and novel developments with regard to computation are given.
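The basic ABC idea the abstract builds on can be illustrated with plain rejection sampling: draw a parameter from the prior, simulate data from the model, and keep the draw if a summary statistic of the simulation is within a tolerance of the observed one. This toy Gaussian example (all names and settings are illustrative, not from the paper) shows the mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

def abc_rejection(observed_stat, prior_sampler, simulator, summary,
                  eps, n_draws):
    """Plain ABC rejection: keep parameter draws whose simulated summary
    statistic falls within eps of the observed one."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()
        y_sim = simulator(theta)            # forward-simulate, no likelihood
        if abs(summary(y_sim) - observed_stat) < eps:
            accepted.append(theta)
    return np.array(accepted)

# Toy example: infer the mean of an i.i.d. Gaussian sample.
y_obs = rng.normal(1.0, 1.0, size=200)
post = abc_rejection(
    observed_stat=y_obs.mean(),
    prior_sampler=lambda: rng.uniform(-5.0, 5.0),
    simulator=lambda th: rng.normal(th, 1.0, size=200),
    summary=np.mean,
    eps=0.1,
    n_draws=5000,
)
```

The accepted draws approximate the posterior given the summary statistic; the paper's contribution is to go beyond this naive scheme for time series models while preserving the model's probabilistic structure.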
Kernel Bayes' rule
A nonparametric kernel-based method for realizing Bayes' rule is proposed,
based on representations of probabilities in reproducing kernel Hilbert spaces.
Probabilities are uniquely characterized by the mean of the canonical map to
the RKHS. The prior and conditional probabilities are expressed in terms of
RKHS functions of an empirical sample: no explicit parametric model is needed
for these quantities. The posterior is likewise an RKHS mean of a weighted
sample. The estimator for the expectation of a function of the posterior is
derived, and rates of consistency are shown. Some representative applications
of the kernel Bayes' rule are presented, including Bayesian computation without
likelihood and filtering with a nonparametric state-space model.
Comment: 27 pages, 5 figures
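The flavor of RKHS-based posterior computation can be conveyed with a simplified, regularized conditional-embedding weighting (a hedged sketch only: the paper's full kernel Bayes' rule estimator differs in its details, and all names and settings below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_gram(a, b, sigma=0.5):
    """Gaussian RBF Gram matrix k(a_i, b_j) = exp(-(a_i - b_j)^2 / (2 sigma^2))."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2.0 * sigma ** 2))

def posterior_weights(x_prior, y_prior, y_obs, lam=1e-2, sigma=0.5):
    """Weights on the prior sample x_prior given an observation y_obs, via a
    regularized conditional mean embedding (simplified variant of the
    kernel Bayes' rule machinery).  (x_prior[i], y_prior[i]) are joint
    draws of the hidden quantity and the observation; no parametric model
    for either distribution is used."""
    n = len(x_prior)
    G = rbf_gram(y_prior, y_prior, sigma)
    k = rbf_gram(y_prior, np.array([y_obs]), sigma).ravel()
    w = np.linalg.solve(G + n * lam * np.eye(n), k)
    return w / w.sum()

# Toy Gaussian model: x ~ N(0, 1), y | x ~ N(x, 0.5^2); observe y = 1,
# for which the exact posterior mean is 0.8.
x = rng.normal(0.0, 1.0, size=500)
y = x + rng.normal(0.0, 0.5, size=500)
w = posterior_weights(x, y, y_obs=1.0)
post_mean = float(w @ x)   # RKHS-weighted estimate of E[x | y = 1]
```

As in the abstract, the posterior is represented purely as a weighted sample; expectations of functions of the posterior are estimated by applying the weights to function values at the sample points.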
The iterated auxiliary particle filter
We present an offline, iterated particle filter to facilitate statistical
inference in general state space hidden Markov models. Given a model and a
sequence of observations, the associated marginal likelihood L is central to
likelihood-based inference for unknown statistical parameters. We define a
class of "twisted" models: each member is specified by a sequence of positive
functions psi and has an associated psi-auxiliary particle filter that provides
unbiased estimates of L. We identify a sequence psi* that is optimal in the
sense that the psi*-auxiliary particle filter's estimate of L has zero
variance. In practical applications, psi* is unknown so the psi*-auxiliary
particle filter cannot straightforwardly be implemented. We use an iterative
scheme to approximate psi*, and demonstrate empirically that the resulting
iterated auxiliary particle filter significantly outperforms the bootstrap
particle filter in challenging settings. Applications include parameter
estimation using a particle Markov chain Monte Carlo algorithm.
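The marginal likelihood estimate L that the twisted models target is the quantity produced by an ordinary bootstrap particle filter, which the iterated scheme is designed to improve upon. A minimal sketch of that baseline estimator, for an illustrative linear-Gaussian AR(1) model (the model and all names here are assumptions, not the paper's examples):

```python
import numpy as np

def bootstrap_pf_loglik(y, n_particles, rng, phi=0.9, sigma_x=1.0, sigma_y=1.0):
    """Bootstrap particle filter for the state space model
        x_t = phi * x_{t-1} + N(0, sigma_x^2),   y_t = x_t + N(0, sigma_y^2),
    returning the log of the particle estimate of the marginal likelihood
    L = p(y_{1:T}); the estimate of L itself is unbiased."""
    # stationary initial distribution of the AR(1) state
    x = rng.normal(0.0, sigma_x / np.sqrt(1.0 - phi ** 2), size=n_particles)
    log_L = 0.0
    for yt in y:
        # Gaussian observation log-weights
        logw = (-0.5 * ((yt - x) / sigma_y) ** 2
                - 0.5 * np.log(2.0 * np.pi * sigma_y ** 2))
        m = logw.max()
        w = np.exp(logw - m)
        log_L += m + np.log(w.mean())        # accumulate likelihood factor
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        x = phi * x[idx] + rng.normal(0.0, sigma_x, size=n_particles)
    return log_L
```

A psi-auxiliary particle filter replaces the blind proposals above with ones twisted by the functions psi; at the optimal psi* the variance of the estimate of L collapses to zero, which is what the iterated approximation scheme exploits.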
Nonparametric Belief Propagation and Facial Appearance Estimation
In many applications of graphical models arising in computer vision, the hidden
variables of interest are most naturally specified by continuous, non-Gaussian
distributions. There exist inference algorithms for discrete approximations to
these continuous distributions, but for the high-dimensional variables
typically of interest, discrete inference becomes infeasible. Stochastic
methods such as particle filters provide an appealing alternative. However,
existing techniques fail to exploit the rich structure of the graphical models
describing many vision problems. Drawing on ideas from regularized particle
filters and belief propagation (BP), this paper develops a nonparametric belief
propagation (NBP) algorithm applicable to general graphs. Each NBP iteration
uses an efficient sampling procedure to update kernel-based approximations to
the true, continuous likelihoods. The algorithm can accommodate an extremely
broad class of potential functions, including nonparametric representations.
Thus, NBP extends particle filtering methods to the more general vision
problems that graphical models can describe. We apply the NBP algorithm to
infer component interrelationships in a parts-based face model, allowing
location and reconstruction of occluded features.
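The core NBP operation, fusing sample-based messages via kernel density estimates, can be sketched in one dimension (a hedged illustration of the idea, not the paper's sampling procedure; all names are hypothetical):

```python
import numpy as np

def kde(points, samples, bw):
    """Gaussian kernel density estimate built from message samples."""
    d = (points[:, None] - samples[None, :]) / bw
    return np.exp(-0.5 * d ** 2).mean(axis=1) / (bw * np.sqrt(2.0 * np.pi))

def fuse_messages(msg_a, msg_b, bw, n_out, rng):
    """One NBP-style fusion step: sample from message A's kernel density,
    then importance-resample by message B's density, so the output sample
    approximates the normalized product of the two incoming messages."""
    # draw from the KDE of message A (pick a sample, jitter by the kernel)
    proposals = rng.choice(msg_a, size=n_out) + rng.normal(0.0, bw, size=n_out)
    w = kde(proposals, msg_b, bw)            # reweight by message B's density
    idx = rng.choice(n_out, size=n_out, p=w / w.sum())
    return proposals[idx]
```

Because the messages stay sample-based throughout, the node variables can remain continuous and non-Gaussian, which is exactly what makes the approach attractive for the high-dimensional vision problems the abstract describes.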
Inference via low-dimensional couplings
We investigate the low-dimensional structure of deterministic transformations
between random variables, i.e., transport maps between probability measures. In
the context of statistics and machine learning, these transformations can be
used to couple a tractable "reference" measure (e.g., a standard Gaussian) with
a target measure of interest. Direct simulation from the desired measure can
then be achieved by pushing forward reference samples through the map. Yet
characterizing such a map---e.g., representing and evaluating it---grows
challenging in high dimensions. The central contribution of this paper is to
establish a link between the Markov properties of the target measure and the
existence of low-dimensional couplings, induced by transport maps that are
sparse and/or decomposable. Our analysis not only facilitates the construction
of transformations in high-dimensional settings, but also suggests new
inference methodologies for continuous non-Gaussian graphical models. For
instance, in the context of nonlinear state-space models, we describe new
variational algorithms for filtering, smoothing, and sequential parameter
inference. These algorithms can be understood as the natural
generalization---to the non-Gaussian case---of the square-root
Rauch-Tung-Striebel Gaussian smoother.
Comment: 78 pages, 25 figures
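The link between Markov structure and sparse transport maps can be made concrete with a toy lower-triangular (Knothe-Rosenblatt-style) map pushing a standard Gaussian reference forward to a stationary AR(1) Gaussian target (an illustrative construction, not one of the paper's examples):

```python
import numpy as np

def sparse_triangular_map(z, a=0.7):
    """Lower-triangular transport map from standard Gaussian reference
    samples z to a stationary AR(1) Gaussian target with lag-1
    correlation a.  Component j depends only on component j-1 of the
    output, so the map's sparsity pattern mirrors the target's Markov
    chain structure."""
    x = np.empty_like(z)
    x[:, 0] = z[:, 0]
    for j in range(1, z.shape[1]):
        x[:, j] = a * x[:, j - 1] + np.sqrt(1.0 - a ** 2) * z[:, j]
    return x

rng = np.random.default_rng(6)
z = rng.standard_normal((50000, 5))    # reference samples
x = sparse_triangular_map(z)           # pushforward = target samples
```

A dense map in five dimensions would need all fifteen lower-triangular interactions; here the chain's conditional independence reduces it to one interaction per component, which is the kind of low-dimensional coupling the paper characterizes in general.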