27 research outputs found
Decentralized Riemannian Particle Filtering with Applications to Multi-Agent Localization
The primary focus of this research is to develop consistent nonlinear decentralized particle filtering approaches to the problem of multiple-agent localization. A key aspect of our development is the use of Riemannian geometry to exploit the inherently non-Euclidean characteristics that are typical of multiple-agent localization scenarios. A decentralized formulation is considered due to the practical advantages it provides over centralized fusion architectures. Inspiration is taken from the relatively new field of information geometry and the more established research field of computer vision. Differential geometric tools such as manifolds, geodesics, tangent spaces, and exponential and logarithmic mappings are used extensively to describe probabilistic quantities. Numerous probabilistic parameterizations were identified before settling on the efficient square-root probability density function parameterization. The square-root parameterization has the benefit of allowing filter calculations to be carried out on the well-studied Riemannian unit hypersphere. A key advantage of selecting the unit hypersphere is that it permits closed-form calculations, a characteristic not shared by current solution approaches. Through the use of the Riemannian geometry of the unit hypersphere, we are able to demonstrate the ability to produce estimates that are not overly optimistic. Results are presented that clearly show the ability of the proposed approaches to outperform current state-of-the-art decentralized particle filtering methods. In particular, results are presented that emphasize the achievable improvement in estimation error, estimator consistency, and required computational burden.
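The square-root construction described above can be made concrete: a discretized density f, represented by ψ = sqrt(f·Δx), has unit Euclidean norm and therefore lies on the unit hypersphere, where the exponential and logarithmic maps have closed forms. The following is a minimal sketch, not the dissertation's code; the grid, tolerance, and normalization choices are assumptions:

```python
import numpy as np

def sphere_exp(p, v):
    """Exponential map on the unit hypersphere at base point p,
    applied to a tangent vector v (closed form)."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return p
    return np.cos(n) * p + np.sin(n) * (v / n)

def sphere_log(p, q):
    """Logarithmic map: the tangent vector at p pointing toward q,
    with length equal to the geodesic distance."""
    c = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(p)
    u = q - c * p
    return theta * u / np.linalg.norm(u)

# Square-root parameterization: a discretized density f becomes a point
# psi on the unit sphere, since sum(psi**2) = sum(f * dx) = 1.
x = np.linspace(-5.0, 5.0, 201)
dx = x[1] - x[0]
f = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
psi = np.sqrt(f * dx)
psi /= np.linalg.norm(psi)  # guard against discretization error
```

Because both maps are closed-form, filter operations on densities reduce to vector arithmetic on the sphere, which is the computational advantage the abstract refers to.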
Distributed Random Set Theoretic Soft/Hard Data Fusion
Research on multisensor data fusion aims at providing the enabling technology to combine
information from several sources into a unified picture. The literature on fusion of
conventional data provided by non-human (hard) sensors is vast and well established.
In comparison to conventional fusion systems, where input data are generated by
calibrated electronic sensor systems with well-defined characteristics, research
on soft data fusion considers combining human-based data expressed, preferably, in
unconstrained natural language form. Fusion of soft and hard data is even more
challenging, yet necessary in some applications, and has received little attention
in the past. Being a rather new area of research, soft/hard data fusion is still in
a fledgling stage, with even its most challenging problems yet to be adequately
defined and explored.
This dissertation develops a framework to enable fusion of both soft and hard data
with the Random Set (RS) theory as the underlying mathematical foundation. Random
set theory is an emerging framework within the data fusion community that, owing to its powerful
representational and computational capabilities, is attracting growing attention among
data fusion researchers. Motivated by the unique characteristics of random set
theory and the main challenge of soft/hard data fusion systems, i.e. the need for a unifying
framework capable of processing both unconventional soft data and conventional hard data,
this dissertation argues in favor of a random set theoretic approach as the first step towards
realizing a soft/hard data fusion framework.
Several challenging problems related to soft/hard fusion systems are addressed in the
proposed framework. First, an extension of the well-known Kalman filter within random
set theory, called the Kalman evidential filter (KEF), is adopted as a common data processing
framework for both soft and hard data. Second, a novel ontology (syntax+semantics)
is developed to allow for modeling soft (human-generated) data assuming target tracking
as the application. Third, as soft/hard data fusion is mostly aimed at large networks of
information processing, a new approach is proposed to enable distributed estimation of
soft, as well as hard data, addressing the scalability requirement of such fusion systems.
Fourth, a method for modeling trust in the human agents is developed, which enables the
fusion system to protect itself from erroneous/misleading soft data through discounting
such data on-the-fly. Fifth, leveraging recent developments in the RS theoretic data
fusion literature, a novel soft data association algorithm is developed and deployed to extend
the proposed target tracking framework to the multi-target tracking case. Finally, the
multi-target tracking framework is complemented by a distributed classification
approach applicable to target classes described with soft, human-generated data.
In addition, this dissertation presents a novel data-centric taxonomy of data fusion
methodologies. In particular, several categories of fusion algorithms have been identified
and discussed based on the data-related challenge(s) they address. The taxonomy is intended to
provide the reader with a generic and comprehensive view of the contemporary data fusion
literature, and could also serve as a reference for data fusion practitioners by providing
them with design guidelines, in terms of algorithm choice, for the specific
data-related challenges expected in a given application.
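The trust-based discounting idea described above (point four) has a well-known classical counterpart in evidence theory: Shafer discounting, in which a source's belief masses are scaled by a trust factor and the remainder is assigned to total ignorance. The sketch below illustrates that classical operation, not the dissertation's exact RS-theoretic method; the function name and the toy report are assumptions:

```python
def discount_mass(m, alpha, frame):
    """Classical Shafer discounting: scale every focal mass by the trust
    factor alpha and move the remaining (1 - alpha) to total ignorance,
    i.e. to the whole frame of discernment."""
    out = {A: alpha * v for A, v in m.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

# A human report asserting "target is hostile" with some residual ignorance,
# supplied by a source we trust at level 0.8.
frame = frozenset({"hostile", "friendly"})
m = {frozenset({"hostile"}): 0.9, frame: 0.1}
m_trusted = discount_mass(m, alpha=0.8, frame=frame)
```

A low trust factor pushes mass toward ignorance rather than deleting the report outright, which is what lets a fusion system down-weight misleading soft data on the fly without discarding it.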
Invertible Particle Flow-based Sequential MCMC with extension to Gaussian Mixture noise models
Sequential state estimation in non-linear and non-Gaussian state spaces has a
wide range of applications in statistics and signal processing. One of the most
effective non-linear filtering approaches, particle filtering, suffers from
weight degeneracy in high-dimensional filtering scenarios. Several avenues have
been pursued to address high-dimensionality. Among these, particle flow
particle filters construct effective proposal distributions by using invertible
flow to migrate particles continuously from the prior distribution to the
posterior, and sequential Markov chain Monte Carlo (SMCMC) methods use a
Metropolis-Hastings (MH) accept-reject approach to improve filtering
performance. In this paper, we propose to combine the strengths of invertible
particle flow and SMCMC by constructing a composite Metropolis-Hastings (MH)
kernel within the SMCMC framework using invertible particle flow. In addition,
we propose a Gaussian mixture model (GMM)-based particle flow algorithm to
construct effective MH kernels for multi-modal distributions. Simulation
results show that, for high-dimensional state estimation example problems, the
proposed kernels significantly increase the acceptance rate with minimal
additional computational overhead and improve estimation accuracy compared with
state-of-the-art filtering algorithms.
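The Metropolis-Hastings accept-reject mechanism at the core of SMCMC can be illustrated with a plain random-walk kernel; the paper's composite kernel additionally uses invertible particle flow to construct the proposal, which is not reproduced here. The toy Gaussian target and proposal scale below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

def mh_step(x, log_target, proposal_scale=0.5):
    """One Metropolis-Hastings accept-reject step with a symmetric
    Gaussian random-walk proposal (so the proposal ratio cancels)."""
    x_prop = x + proposal_scale * rng.standard_normal(x.shape)
    log_alpha = log_target(x_prop) - log_target(x)
    if np.log(rng.uniform()) < log_alpha:
        return x_prop, True   # accept the move
    return x, False           # reject: keep the current state

# Toy target: a standard 2-D Gaussian posterior.
log_target = lambda x: -0.5 * np.sum(x ** 2)

x = np.zeros(2)
samples, accepts = [], 0
for _ in range(5000):
    x, ok = mh_step(x, log_target)
    accepts += ok
    samples.append(x)
rate = accepts / 5000
```

The acceptance rate is exactly the quantity the paper seeks to raise: a proposal built from particle flow lands closer to the posterior than a blind random walk, so more moves are accepted for the same computational budget.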
State Estimation for Distributed Systems with Stochastic and Set-membership Uncertainties
State estimation techniques for centralized, distributed, and decentralized systems are studied. An easy-to-implement state estimation concept is introduced that generalizes and combines basic principles of Kalman filter theory and ellipsoidal calculus. By means of this method, stochastic and set-membership uncertainties can be taken into consideration simultaneously. Different solutions for implementing these estimation algorithms in distributed networked systems are presented.
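The set-membership side of such a combination can be illustrated with a standard result from ellipsoidal calculus: the Minkowski sum of two ellipsoids E(Q1) and E(Q2) is outer-bounded by a one-parameter family of ellipsoids, and the trace-minimal member has a closed form. This is a textbook formula, not necessarily the exact parameterization used in this work:

```python
import numpy as np

def ellipsoid_sum(Q1, Q2):
    """Trace-minimal outer ellipsoid of the Minkowski sum of two centered
    ellipsoids with shape matrices Q1 and Q2: the family
    (1 + 1/p) * Q1 + (1 + p) * Q2 contains the sum for every p > 0,
    and p = sqrt(tr Q1 / tr Q2) minimizes the trace of the bound."""
    p = np.sqrt(np.trace(Q1) / np.trace(Q2))
    return (1.0 + 1.0 / p) * Q1 + (1.0 + p) * Q2
```

In one dimension the bound is tight: intervals of half-width 1 and 2 (shape matrices 1 and 4) sum to an interval of half-width 3, i.e. shape matrix 9, which is exactly what the formula returns. The stochastic part of the combined filter handles covariances in the usual Kalman fashion, while set-membership terms propagate through sums like this one.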
An evolutionary approach to optimising neural network predictors for passive sonar target tracking
Object tracking is important in autonomous robotics, military applications, financial
time-series forecasting, and mobile systems. In order to correctly track through clutter,
algorithms which predict the next value in a time series are essential.
The competence of standard machine learning techniques at producing bearing prediction
estimates was examined. The results show that the classification-based algorithms
produce more accurate estimates than the state-of-the-art statistical models. Artificial
Neural Networks (ANNs) and K-Nearest Neighbour were both used, demonstrating that the
approach is not specific to a single classifier. [Continues.]
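The learning-based next-value prediction idea can be sketched with a simple K-Nearest-Neighbour predictor over sliding windows: match the most recent window of bearings against historical windows and average the values that followed the closest matches. The window size, k, and Euclidean matching below are illustrative assumptions, not the thesis configuration:

```python
import numpy as np

def knn_predict_next(series, window=3, k=3):
    """Predict the next value of a time series by comparing the most
    recent window against all historical windows and averaging the
    successors of the k nearest matches."""
    # Build the training set: every historical window and its successor.
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    # Query with the most recent window.
    query = np.array(series[-window:])
    d = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(d)[:k]
    return y[nearest].mean()
```

On a clean linear bearing trend, e.g. `knn_predict_next(list(range(20)), window=3, k=1)`, the nearest historical window is the immediately preceding one, so the predictor returns the continuation of the trend; an ANN regressor can be swapped in for the same window/successor training pairs, which is the sense in which the technique is classifier-agnostic.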
Advances in Evolutionary Algorithms
With recent trends toward massive data sets and significant computational power, combined with advances in evolutionary algorithms, evolutionary computation is becoming much more relevant to practice. The aim of this book is to present recent improvements, innovative ideas, and concepts from a part of the huge field of evolutionary algorithms.