Compressed particle methods for expensive models with application in Astronomy and Remote Sensing
In many inference problems, the evaluation of complex and costly models is
often required. In this context, Bayesian methods have become very popular in
several fields in recent years, for tasks such as parameter inversion,
model selection or uncertainty quantification. Bayesian inference requires the
approximation of complicated integrals involving (often costly) posterior
distributions. Generally, this approximation is obtained by means of Monte
Carlo (MC) methods. In order to reduce the computational cost of the
corresponding technique, surrogate models (also called emulators) are often
employed. An alternative approach is the so-called Approximate Bayesian
Computation (ABC) scheme. ABC does not require evaluating the costly model
itself, only the ability to simulate artificial data according to that model.
In ABC, however, a suitable distance between real and artificial data must
also be chosen. In this work, we introduce a novel approach where the
expensive model is evaluated only at some well-chosen samples. The selection of
these nodes is based on the so-called compressed Monte Carlo (CMC) scheme. We
provide theoretical results supporting the novel algorithms and give empirical
evidence of the performance of the proposed method in several numerical
experiments. Two of them are real-world applications in astronomy and satellite
remote sensing.

Comment: Published in IEEE Transactions on Aerospace and Electronic Systems.
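As a rough illustration of the ABC idea mentioned above, the sketch below implements plain rejection ABC on a toy Gaussian model; the simulator, prior, tolerance and summary distance are all assumptions made for illustration, not the paper's CMC-based algorithm.

```python
# Rejection-ABC sketch on a toy model (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    # Hypothetical costly forward model: Gaussian data with unknown mean theta.
    return rng.normal(theta, 1.0, size=n)

def distance(x, y):
    # Distance between summary statistics of artificial and real data.
    return abs(x.mean() - y.mean())

y_obs = simulate(1.5)              # stand-in for the real observations
eps = 0.1                          # ABC tolerance
accepted = []
for _ in range(5000):
    theta = rng.uniform(-5, 5)     # draw a candidate from the prior
    if distance(simulate(theta), y_obs) < eps:
        accepted.append(theta)     # keep it if artificial data is close to real data

print(f"ABC posterior mean ~ {np.mean(accepted):.3f} ({len(accepted)} accepted)")
```

Note how the model is never evaluated as a likelihood: only simulation and a data distance are needed, which is exactly the trade-off the abstract points out.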
A Survey of Recent Advances in Particle Filters and Remaining Challenges for Multitarget Tracking
We review advances in the particle filtering (PF) algorithm achieved over the
last decade in the context of target tracking, with regard to either a single
target or multiple targets in the presence of false or missing data. The first
part of our review covers notable achievements for the single-target PF from
several aspects, including importance
proposal, computing efficiency, particle degeneracy/impoverishment and constrained/multi-modal
systems. The second part of our review is on analyzing the intractable challenges raised within
the general multitarget (multi-sensor) tracking due to random target birth and termination, false
alarm, misdetection, measurement-to-track (M2T) uncertainty and track uncertainty. The mainstream
multitarget PF approaches consist of two main classes, one based on M2T association approaches and
the other not such as the finite set statistics-based PF. In either case, significant challenges remain due
to unknown tracking scenarios and integrated tracking management
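For readers new to PF, the following minimal bootstrap (SIR) filter on a scalar linear-Gaussian model shows the propagate/weight/resample cycle and the multinomial resampling step that combats particle degeneracy; the model and all parameters are illustrative assumptions, not any specific surveyed method.

```python
# Bootstrap particle filter (SIR) sketch on a toy scalar state-space model.
import numpy as np

rng = np.random.default_rng(1)
T, N = 50, 500                                  # time steps, particles

# Assumed model: x_t = 0.9 x_{t-1} + v_t,  y_t = x_t + w_t
x_true = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal(0.0, 1.0)
    y[t] = x_true[t] + rng.normal(0.0, 0.5)

particles = rng.normal(0.0, 1.0, N)
est = np.zeros(T - 1)
for t in range(1, T):
    particles = 0.9 * particles + rng.normal(0.0, 1.0, N)   # propagate (prior proposal)
    logw = -0.5 * ((y[t] - particles) / 0.5) ** 2           # Gaussian likelihood weights
    w = np.exp(logw - logw.max()); w /= w.sum()             # normalize stably
    est[t - 1] = np.sum(w * particles)                      # weighted state estimate
    particles = particles[rng.choice(N, size=N, p=w)]       # multinomial resampling

print(f"RMSE ~ {np.sqrt(np.mean((est - x_true[1:]) ** 2)):.3f}")
```

The review's themes map directly onto this loop: better importance proposals replace the prior-propagation line, and alternative resampling schemes replace the multinomial draw.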
Group Importance Sampling for Particle Filtering and MCMC
Bayesian methods and their implementations by means of sophisticated Monte
Carlo techniques have become very popular in signal processing in recent
years. Importance Sampling (IS) is a well-known Monte Carlo technique that
approximates integrals involving a posterior distribution by means of weighted
samples. In this work, we study the assignment of a single weighted sample
which compresses the information contained in a population of weighted samples.
Part of the theory that we present as Group Importance Sampling (GIS) has been
employed implicitly in different works in the literature. The provided analysis
yields several theoretical and practical consequences. For instance, we discuss
the application of GIS to the Sequential Importance Resampling framework and
show that Independent Multiple Try Metropolis schemes can be interpreted as a
standard Metropolis-Hastings algorithm, following the GIS approach. We also
introduce two novel Markov Chain Monte Carlo (MCMC) techniques based on GIS.
The first one, the Group Metropolis Sampling method, produces a Markov chain
of sets of weighted samples; all these sets are then employed to obtain a
single global estimator. The second one is the Distributed Particle
Metropolis-Hastings technique, where different parallel particle filters are
jointly used to drive an MCMC algorithm. Different resampled trajectories are
compared and then accepted or rejected with a proper acceptance probability.
The novel
schemes are tested in different numerical experiments such as learning the
hyperparameters of Gaussian Processes, two localization problems in a wireless
sensor network (with synthetic and real data) and the tracking of vegetation
parameters given satellite observations, where they are compared with several
benchmark Monte Carlo techniques. Three illustrative Matlab demos are also
provided.

Comment: To appear in Digital Signal Processing. Related Matlab demos are
provided at https://github.com/lukafree/GIS.git
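The core compression step described in the abstract can be sketched as follows: a population of weighted samples is summarized by one resampled particle that carries a group weight (here, the average unnormalized importance weight). The target and proposal below are toy assumptions; consult the paper and the linked demos for the actual GIS constructions.

```python
# Sketch of compressing a weighted population into one group sample + group weight.
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Assumed unnormalized target: standard Gaussian.
    return -0.5 * x ** 2

def log_proposal(x, sigma=3.0):
    # Gaussian proposal N(0, sigma^2), up to an additive constant.
    return -0.5 * (x / sigma) ** 2 - np.log(sigma)

def compress(samples, logw):
    w = np.exp(logw - logw.max())
    group_logw = logw.max() + np.log(w.mean())      # log of the average raw weight
    idx = rng.choice(len(samples), p=w / w.sum())   # resample one representative
    return samples[idx], group_logw

xs = rng.normal(0.0, 3.0, size=1000)                # draw a population from the proposal
logw = log_target(xs) - log_proposal(xs)            # log importance weights
x_tilde, gw = compress(xs, logw)
print(f"group sample {x_tilde:.3f}, log group weight {gw:.3f}")
```

Keeping the pair (group sample, group weight) is what lets several populations, e.g. from parallel particle filters, be combined or compared inside a single MCMC acceptance test, as in the schemes the abstract introduces.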
Marginal likelihood computation for model selection and hypothesis testing: an extensive review
This is an up-to-date introduction to, and overview of, marginal likelihood
computation for model selection and hypothesis testing. Computing normalizing
constants of probability models (or ratios of constants) is a fundamental issue
in many applications in statistics, applied mathematics, signal processing and
machine learning. This article provides a comprehensive study of the
state of the art on the topic. We highlight limitations, benefits, connections
and differences among the different techniques. Problems and possible solutions
with the use of improper priors are also described. Some of the most relevant
methodologies are compared through theoretical analysis and numerical
experiments.

Comment: Keywords: marginal likelihood, Bayesian evidence, numerical
integration, model selection, hypothesis testing, quadrature rules,
doubly-intractable posteriors, partition function.
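As the simplest member of the family such reviews cover, the sketch below estimates the marginal likelihood Z = ∫ p(y|θ) p(θ) dθ by naive Monte Carlo over prior draws, with a log-sum-exp trick for numerical stability; the Gaussian model and data are illustrative assumptions.

```python
# Naive Monte Carlo estimate of the marginal likelihood on a toy Gaussian model.
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(1.0, 1.0, size=20)                   # synthetic data from N(1, 1)

def log_likelihood(theta, y):
    # Gaussian likelihood with unit variance and unknown mean theta.
    return -0.5 * np.sum((y - theta) ** 2) - 0.5 * len(y) * np.log(2.0 * np.pi)

thetas = rng.normal(0.0, 2.0, size=20000)           # draws from the prior N(0, 2^2)
logL = np.array([log_likelihood(t, y) for t in thetas])
logZ = logL.max() + np.log(np.mean(np.exp(logL - logL.max())))   # stable log-mean-exp
print(f"log marginal likelihood ~ {logZ:.3f}")
```

This estimator is unbiased but suffers high variance when the posterior is much narrower than the prior, one of the limitations such surveys highlight and that more refined quadrature and bridging methods address.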