Consensus in the Presence of Multiple Opinion Leaders: Effect of Bounded Confidence
The problem of analyzing the performance of networked agents exchanging
evidence in a dynamic network has recently grown in importance. This problem
has relevance in signal and data fusion network applications and in studying
opinion and consensus dynamics in social networks. Due to its capability of
handling a wider variety of uncertainties and ambiguities associated with
evidence, we use the framework of Dempster-Shafer (DS) theory to capture the
opinion of an agent. We then examine the consensus among agents in dynamic
networks in which an agent can utilize either a cautious or receptive updating
strategy. In particular, we examine the case of bounded confidence updating
where an agent exchanges its opinion only with neighboring nodes possessing
'similar' evidence. In a fusion network, this captures the case in which nodes
only update their state based on evidence consistent with the node's own
evidence. In opinion dynamics, this captures the notions of Social Judgment
Theory (SJT) in which agents update their opinions only with other agents
possessing opinions closer to their own. Focusing on the two special DS
theoretic cases where an agent state is modeled as a Dirichlet body of evidence
and a probability mass function (p.m.f.), we utilize results from matrix
theory, graph theory, and networks to prove the existence of consensus agent
states in several time-varying network cases of interest. For example, we show
the existence of a consensus in which a subset of network nodes achieves a
consensus that is adopted by follower network nodes. Of particular interest is
the case of multiple opinion leaders, where we show that the agents do not
reach a consensus in general, but rather converge to 'opinion clusters'.
Simulation results are provided to illustrate the main results.
Comment: IEEE Transactions on Signal and Information Processing Over Networks, to appear.
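The bounded-confidence update for the p.m.f. special case can be illustrated with a Hegselmann-Krause-style sketch. This is an illustration only: the L1 distance measure, the threshold `eps`, and the synchronous equal-weight averaging below are assumptions, not the paper's exact DS-theoretic update rule.

```python
import numpy as np

def bounded_confidence_step(P, eps):
    # P: n x k array, row i is agent i's p.m.f. over k outcomes.
    # Each agent averages only over peers whose opinion lies within
    # L1 distance eps of its own (bounded confidence).
    n = len(P)
    P_new = np.empty_like(P)
    for i in range(n):
        close = [j for j in range(n) if np.abs(P[i] - P[j]).sum() <= eps]
        P_new[i] = P[close].mean(axis=0)
    return P_new
```

Iterating this step on opinions that start in well-separated groups reproduces the 'opinion cluster' behaviour described above: agents within a cluster merge, while clusters farther apart than `eps` never interact.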
A Dempster-Shafer Theoretic Conditional Approach to Evidence Updating for Fusion of Hard and Soft Data
Fusion of hard data with soft data is an issue that has attracted recent attention. An effective fusion strategy requires an analytical framework that can capture the uncertainty inherent in hard and soft data. For instance, computational linguistic parsing of text-based data generates logical propositions that inherently possess significant semantic ambiguity. An effective fusion framework must exploit the respective advantages of hard and soft data while mitigating their particular weaknesses. In this paper we describe a Dempster-Shafer theoretic approach to hard and soft data fusion that relies upon the novel conditional approach to updating. The conditional approach engenders a more flexible method that allows for tuning and adapting update strategies. When computational complexity concerns are taken into account, it also provides guidance on how evidence could be ordered for updating. This has important implications in working with models that convert propositional logic statements from text into Dempster-Shafer theoretic form.
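The conditional updating described above builds on Dempster-Shafer evidence combination. As background, classical Dempster combination and Dempster conditioning (conditioning a mass function on a set B by combining it with the categorical body of evidence m_B(B) = 1) can be sketched as follows. The frozenset representation of focal elements is an implementation choice for illustration, not the paper's formulation.

```python
def dempster_combine(m1, m2):
    # Dempster's rule of combination: mass functions are dicts mapping
    # frozenset focal elements to masses summing to 1.
    combined = {}
    conflict = 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; combination undefined")
    k = 1.0 / (1.0 - conflict)       # renormalize away the conflict
    return {a: w * k for a, w in combined.items()}

def dempster_condition(m, b):
    # Dempster conditioning: combine m with the categorical BPA m_B(B) = 1.
    return dempster_combine(m, {frozenset(b): 1.0})
```

For example, conditioning a mass function over {rain, sun, snow} on the set {rain, snow} redistributes each focal element's mass to its intersection with that set.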
High-level Information Fusion for Constrained SMC Methods and Applications
Information Fusion is a field that studies processes utilizing data from various input sources, and techniques exploiting this data to produce estimates and knowledge about objects and situations. On the other hand, human computation is a new and evolving research area that uses human intelligence to solve computational problems that are beyond the scope of existing artificial intelligence algorithms. In previous systems, humans' role was mostly restricted to analysing a finished fusion product; however, in current systems the role of humans is an integral element in a distributed framework, where many tasks can be accomplished by either humans or machines. Moreover, some information can be provided only by humans, not machines, because the observational capabilities and opportunities for traditional electronic (hard) sensors are limited.
A source-reliability-adaptive distributed non-linear estimation method applicable to a number of distributed state estimation problems is proposed. The proposed method requires only local data exchange among neighbouring sensor nodes. It therefore provides enhanced reliability, scalability, and ease of deployment. In particular, by taking into account the estimation reliability of each sensor node at any point in time, it yields a more robust distributed estimation. To perform the Multi-Model Particle Filtering (MMPF) in an adaptive distributed manner, a Gaussian approximation of the particle cloud obtained at each sensor node, along with a weighted Consensus Propagation (CP)-based distributed data aggregation scheme, are deployed to dynamically re-weight the particle clouds.
The filtering is a soft-data-constrained variant of multi-model particle filter, and is capable of processing both soft human-generated data and conventional hard sensory data. If permanent noise occurs in the estimation provided by a sensor node, due to either a faulty sensing device or misleading soft data, the contribution of that node in the weighted consensus process is immediately reduced in order to alleviate its effect on the estimation provided by the neighbouring nodes and the entire network. The robustness of the proposed source-reliability-adaptive distributed estimation method is demonstrated through simulation results for agile target tracking scenarios. Agility here refers to cases in which the observed dynamics of targets deviate from the given probabilistic characterization.
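The reliability-weighted consensus idea can be sketched as follows. This is a minimal scalar illustration under stated assumptions: a fixed row-stochastic mixing matrix built from hypothetical reliability weights `r`, rather than the thesis' actual weighted Consensus Propagation scheme over particle clouds. A node with low reliability (e.g. one fed misleading soft data) contributes little to its neighbours' averages, exactly the down-weighting effect described above.

```python
import numpy as np

def reliability_weighted_consensus(x0, r, A, steps=50):
    # x0: initial scalar estimate at each node; r: reliability weight per
    # node in (0, 1]; A: symmetric 0/1 adjacency matrix of the sensor network.
    # Each node repeatedly averages over its closed neighbourhood, weighting
    # neighbour j by its reliability r[j].
    x = np.asarray(x0, dtype=float).copy()
    r = np.asarray(r, dtype=float)
    A = np.asarray(A, dtype=float) + np.eye(len(x))  # include self-loop
    W = A * r                             # weight of edge (i, j) scaled by r[j]
    W = W / W.sum(axis=1, keepdims=True)  # row-stochastic mixing matrix
    for _ in range(steps):
        x = W @ x                         # one local-exchange consensus step
    return x
```

On a connected graph this converges to a common value dominated by the high-reliability nodes, so a faulty node's outlying estimate has little influence on the network-wide result.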
Furthermore, the same concept is applied to a soft-data-constrained multiple-model Probability Hypothesis Density (PHD) filter that can track multiple agile targets with non-linear dynamics, which is a challenging problem. In this case, a Sequential Monte Carlo-Probability Hypothesis Density (SMC-PHD) filter deploys a Random Set (RS) theoretic formulation, along with a Sequential Monte Carlo approximation, as a variant of Bayes filtering. In general, the performance of Bayesian filtering-based methods can be enhanced by using extra information incorporated as specific constraints into the filtering process. Following the same principle, the new approach uses a constrained variant of the SMC-PHD filter, in which a fuzzy logic approach is used to transform the inherently vague human-generated data into a set of constraints. These constraints are then enforced on the filtering process by applying them as coefficients to the particles' weights. Because the human-generated Soft Data (SD) reports on target-agility level, the proposed constrained-filtering approach is capable of dealing with multiple agile-target tracking scenarios.
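The constraint-enforcement step, fuzzy membership values applied as coefficients to particle weights, can be sketched as below. The triangular membership function and its breakpoints `lo`, `peak`, `hi` are hypothetical stand-ins for the thesis' fuzzy model of agility reports, and a real SMC-PHD implementation would apply the coefficients inside the filter's weight-update stage.

```python
import numpy as np

def apply_soft_constraints(weights, particle_speeds, membership):
    # Scale each particle's weight by the fuzzy membership of its state
    # (here, speed) in the constraint derived from the soft report.
    coeffs = np.array([membership(s) for s in particle_speeds])
    return weights * coeffs

def high_agility(speed, lo=5.0, peak=10.0, hi=15.0):
    # Hypothetical triangular membership for a "high agility" report:
    # 0 outside [lo, hi], rising to 1 at peak, falling back to 0 at hi.
    if speed <= lo or speed >= hi:
        return 0.0
    if speed <= peak:
        return (speed - lo) / (peak - lo)
    return (hi - speed) / (hi - peak)
```

Particles whose motion is incompatible with the reported agility level receive coefficient 0 and are effectively pruned, steering the particle cloud toward states consistent with the soft data.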
Distributed Random Set Theoretic Soft/Hard Data Fusion
Research on multisensor data fusion aims at providing the enabling technology to combine
information from several sources in order to form a unified picture. The literature
on fusion of conventional data provided by non-human (hard) sensors is vast and
well-established. In comparison to conventional fusion systems, where input data are generated
by calibrated electronic sensor systems with well-defined characteristics, research
on soft data fusion considers combining human-based data expressed preferably in unconstrained
natural language form. Fusion of soft and hard data is even more challenging, yet
necessary in some applications, and has received little attention in the past. Being
a rather new area of research, soft/hard data fusion is still at a fledgling stage, with even
its challenging problems yet to be adequately defined and explored.
This dissertation develops a framework to enable fusion of both soft and hard data
with the Random Set (RS) theory as the underlying mathematical foundation. Random
set theory is an emerging theory within the data fusion community that, due to its powerful
representational and computational capabilities, is gaining increasing attention among
data fusion researchers. Motivated by the unique characteristics of the random set
theory and the main challenge of soft/hard data fusion systems, i.e. the need for a unifying
framework capable of processing both unconventional soft data and conventional hard data,
this dissertation argues in favor of a random set theoretic approach as the first step towards
realizing a soft/hard data fusion framework.
Several challenging problems related to soft/hard fusion systems are addressed in the
proposed framework. First, an extension of the well-known Kalman filter within random
set theory, called the Kalman evidential filter (KEF), is adopted as a common data processing
framework for both soft and hard data. Second, a novel ontology (syntax+semantics)
is developed to allow for modeling soft (human-generated) data assuming target tracking
as the application. Third, as soft/hard data fusion is mostly aimed at large networks of
information processing, a new approach is proposed to enable distributed estimation with
soft as well as hard data, addressing the scalability requirement of such fusion systems.
Fourth, a method for modeling trust in the human agents is developed, which enables the
fusion system to protect itself from erroneous/misleading soft data through discounting
such data on-the-fly. Fifth, leveraging the recent developments in the RS theoretic data
fusion literature, a novel soft data association algorithm is developed and deployed to extend
the proposed target tracking framework to the multi-target tracking case. Finally, the
multi-target tracking framework is complemented by introducing a distributed classification
approach applicable to target classes described with soft human-generated data.
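The trust-based protection mentioned in the fourth contribution rests on discounting. A plausible reading, assumed here rather than taken from the dissertation, is Shafer's classical discounting operation: each focal mass is scaled by a trust factor `alpha` and the remainder is moved onto the whole frame `theta` (complete ignorance). The on-the-fly estimation of `alpha` itself is not shown.

```python
def discount_mass(m, alpha, theta):
    # Shafer discounting of a mass function m over frame theta:
    # every focal element keeps alpha of its mass; the withheld
    # (1 - alpha) is assigned to theta, i.e. to total ignorance.
    theta = frozenset(theta)
    out = {a: alpha * w for a, w in m.items() if a != theta}
    out[theta] = alpha * m.get(theta, 0.0) + (1.0 - alpha)
    return out
```

With `alpha` near 0, a distrusted human source's report collapses toward vacuous evidence and so cannot mislead the fused estimate; with `alpha = 1` the report is used as-is.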
In addition, this dissertation presents a novel data-centric taxonomy of data fusion
methodologies. In particular, several categories of fusion algorithms have been identified
and discussed based on the data-related challenging aspect(s) addressed. It is intended to
provide the reader with a generic and comprehensive view of the contemporary data fusion
literature, which could also serve as a reference for data fusion practitioners by providing
them with conducive design guidelines, in terms of algorithm choice, regarding the specific
data-related challenges expected in a given application.