100 research outputs found
Decision-Making with Belief Functions: a Review
Approaches to decision-making under uncertainty in the belief function
framework are reviewed. Most methods are shown to blend criteria for decision
under ignorance with the maximum expected utility principle of Bayesian
decision theory. A distinction is made between methods that construct a
complete preference relation among acts, and those that allow incomparability
of some acts due to lack of information. Methods developed in the imprecise
probability framework are applicable in the Dempster-Shafer context and are
also reviewed. Shafer's constructive decision theory, which substitutes the
notion of goal for that of utility, is described and contrasted with other
approaches. The paper ends by pointing out the need to carry out deeper
investigation of fundamental issues related to decision-making with belief
functions and to assess the descriptive, normative and prescriptive values of
the different approaches.
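Several of the criteria surveyed in this line of work reduce to comparing the lower and upper expected utilities of an act, i.e. Choquet integrals with respect to Bel and Pl. The following minimal sketch is our illustration of that building block, not code from the paper; the function name and example values are ours:

```python
def lower_upper_expected_utility(mass, utility):
    """Lower/upper expected utility of an act under a belief function.

    mass: dict mapping focal sets (frozensets of states) to masses summing to 1.
    utility: dict mapping each state to the act's utility in that state.
    Each focal set contributes its worst state to the lower expectation and
    its best state to the upper one (Choquet integrals w.r.t. Bel and Pl).
    """
    low = sum(m * min(utility[s] for s in A) for A, m in mass.items())
    up = sum(m * max(utility[s] for s in A) for A, m in mass.items())
    return low, up

# Two states of nature; partial ignorance: mass 0.6 on {good}, 0.4 on {good, bad}.
m = {frozenset({"good"}): 0.6, frozenset({"good", "bad"}): 0.4}
u = {"good": 100.0, "bad": 0.0}
print(lower_upper_expected_utility(m, u))  # (60.0, 100.0)
```

Pessimistic (maximin-style) criteria rank acts by the lower value, optimistic ones by the upper; interval-dominance criteria leave two acts incomparable when their intervals overlap.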
Advances and Applications of Dezert-Smarandache Theory (DSmT) for Information Fusion (Collected Works), Vol. 4
The fourth volume on Advances and Applications of Dezert-Smarandache Theory (DSmT) for information fusion collects theoretical and applied contributions of researchers working in different fields of applications and in mathematics. The contributions (see List of Articles published in this book, at the end of the volume) have been published or presented after disseminating the third volume (2009, http://fs.unm.edu/DSmT-book3.pdf) in international conferences, seminars, workshops and journals.
The first part of this book presents the theoretical advancement of DSmT, dealing with belief functions, conditioning and deconditioning, the Analytic Hierarchy Process, decision-making, multi-criteria analysis, evidence theory, combination rules, evidence distance, conflicting beliefs, sources of evidence with different importances and reliabilities, importance of sources, the pignistic probability transformation, qualitative reasoning under uncertainty, imprecise belief structures, 2-tuple linguistic labels, the ELECTRE Tri method, hierarchical proportional redistribution, basic belief assignments, subjective probability measures, Smarandache codification, neutrosophic logic, outranking methods, Dempster-Shafer theory, the Bayes fusion rule, frequentist probability, mean square error, controlling factors, optimal assignment solutions, data association, the Transferable Belief Model, and others.
More applications of DSmT have emerged in the years since the appearance of the third DSmT book in 2009. Accordingly, the second part of this volume is about applications of DSmT in connection with Electronic Support Measures, belief functions, sensor networks, ground moving target and multiple-target tracking, Vehicle-Borne Improvised Explosive Devices, the Belief Interacting Multiple Model filter, seismic and acoustic sensors, Support Vector Machines, alarm classification, the ability of the human visual system, the Uncertainty Representation and Reasoning Evaluation Framework, threat assessment, handwritten signature verification, automatic aircraft recognition, Dynamic Data-Driven Application Systems, adjustment of secure communication trust analysis, and so on.
Finally, the third part presents a list of references related to DSmT, published or presented over the years since its inception in 2004, in chronological order.
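Among the combination rules these volumes discuss, the Proportional Conflict Redistribution rule no. 5 (PCR5) sends each partial conflict m1(X)m2(Y), with X and Y disjoint, back to X and Y in proportion to the masses involved, instead of normalizing it away as Dempster's rule does. A minimal two-source sketch (our illustration, not code from the book; the example BBAs are invented):

```python
def pcr5(m1, m2):
    """Combine two BBAs (dicts frozenset -> mass) with the PCR5 rule.

    Conjunctive part: each product m1(X)*m2(Y) goes to X & Y when non-empty.
    Conflicting products (empty intersection) are redistributed to the two
    focal sets involved, proportionally to the masses they carry.
    """
    out = {}
    for X, a in m1.items():
        for Y, b in m2.items():
            Z = X & Y
            if Z:
                out[Z] = out.get(Z, 0.0) + a * b
            else:
                # proportional redistribution of the partial conflict a*b
                out[X] = out.get(X, 0.0) + a * a * b / (a + b)
                out[Y] = out.get(Y, 0.0) + a * b * b / (a + b)
    return out

A, B = frozenset({"A"}), frozenset({"B"})
m1 = {A: 0.6, frozenset({"A", "B"}): 0.4}
m2 = {B: 0.7, frozenset({"A", "B"}): 0.3}
fused = pcr5(m1, m2)  # masses still sum to 1; no conflict mass is discarded
```

Unlike Dempster's rule, PCR5 remains well-behaved for highly conflicting sources, which is one of the motivations behind DSmT's family of rules.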
Building a binary outranking relation in uncertain, imprecise and multi-experts contexts: The application of evidence theory
We consider multicriteria decision problems where the actions are evaluated on a set of ordinal criteria. The evaluation of each alternative with respect to each criterion may be uncertain and/or imprecise and is provided by one or several experts. We model this evaluation as a basic belief assignment (BBA). In order to compare the different pairs of alternatives according to each criterion, the concept of first belief dominance is proposed. Additionally, criteria weights are also expressed by means of a BBA. A model inspired by ELECTRE I is developed and illustrated by a pedagogical example.
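One natural way to compare two BBAs defined on an ordinal scale, in the spirit of the belief-dominance idea above, is to compare belief and plausibility of the events "evaluation at least g" for every grade g. The sketch below is our plausible reading for illustration only; the paper's exact definition of first belief dominance may differ, and all names and example values are ours:

```python
GRADES = ["bad", "fair", "good"]  # ordinal scale, worst to best
RANK = {g: i for i, g in enumerate(GRADES)}

def bel_pl_at_least(mass, grade):
    """Belief and plausibility that an ordinal evaluation is >= grade.

    mass maps frozensets of grades to masses (a BBA over the scale).
    Bel sums focal sets lying entirely at or above the grade; Pl sums
    those merely intersecting that region.
    """
    t = RANK[grade]
    bel = sum(m for A, m in mass.items() if all(RANK[g] >= t for g in A))
    pl = sum(m for A, m in mass.items() if any(RANK[g] >= t for g in A))
    return bel, pl

def dominates(ma, mb):
    """True if ma is at least as good as mb at every threshold, for Bel and Pl."""
    return all(
        bel_pl_at_least(ma, g)[k] >= bel_pl_at_least(mb, g)[k] - 1e-12
        for g in GRADES for k in (0, 1)
    )

# Expert judgment on one criterion; imprecision modeled by mass on {fair, good}.
ma = {frozenset({"good"}): 0.5, frozenset({"fair", "good"}): 0.5}
mb = {frozenset({"fair"}): 0.7, frozenset({"bad", "fair", "good"}): 0.3}
print(dominates(ma, mb))  # True
```

This mirrors first-order stochastic dominance, lifted to the imprecise setting by checking both the lower (Bel) and upper (Pl) probabilities of each upper set.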
Distributed Random Set Theoretic Soft/Hard Data Fusion
Research on multisensor data fusion aims at providing the enabling technology to combine information from several sources in order to form a unified picture. The literature on fusion of conventional data provided by non-human (hard) sensors is vast and well-established. In comparison to conventional fusion systems, where input data are generated by calibrated electronic sensor systems with well-defined characteristics, research on soft data fusion considers combining human-based data expressed preferably in unconstrained natural-language form. Fusion of soft and hard data is even more challenging, yet necessary in some applications, and has received little attention in the past. Being a rather new area of research, soft/hard data fusion is still in a fledgling stage, with even its challenging problems yet to be adequately defined and explored.
This dissertation develops a framework to enable fusion of both soft and hard data
with the Random Set (RS) theory as the underlying mathematical foundation. Random
set theory is an emerging theory within the data fusion community that, due to its powerful
representational and computational capabilities, is gaining increasing attention among
data fusion researchers. Motivated by the unique characteristics of random set
theory and by the main challenge of soft/hard data fusion systems, i.e. the need for a unifying
framework capable of processing both unconventional soft data and conventional hard data,
this dissertation argues in favor of a random set theoretic approach as the first step towards
realizing a soft/hard data fusion framework.
Several challenging problems related to soft/hard fusion systems are addressed in the
proposed framework. First, an extension of the well-known Kalman filter within random
set theory, called Kalman evidential filter (KEF), is adopted as a common data processing
framework for both soft and hard data. Second, a novel ontology (syntax+semantics)
is developed to allow for modeling soft (human-generated) data assuming target tracking
as the application. Third, as soft/hard data fusion is mostly aimed at large networks of
information processing, a new approach is proposed to enable distributed estimation of
soft, as well as hard data, addressing the scalability requirement of such fusion systems.
Fourth, a method for modeling trust in the human agents is developed, which enables the
fusion system to protect itself from erroneous/misleading soft data through discounting
such data on-the-fly. Fifth, leveraging the recent developments in the RS theoretic data
fusion literature, a novel soft data association algorithm is developed and deployed to extend
the proposed target tracking framework to the multi-target tracking case. Finally, the
multi-target tracking framework is complemented by introducing a distributed classification
approach applicable to target classes described with soft human-generated data.
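The trust-modeling step described above, protecting the fusion system from unreliable human reports, is commonly realized via Shafer's classical discounting, which scales a source's masses by a reliability factor and transfers the remainder to total ignorance. The sketch below is our generic illustration of that operation; the dissertation's random-set-theoretic version is more elaborate, and the example report is invented:

```python
def discount(mass, frame, alpha):
    """Shafer discounting of a BBA by a reliability factor alpha in [0, 1].

    Each focal mass is scaled by alpha; the missing mass 1 - alpha is
    transferred to the whole frame (total ignorance). alpha = 1 keeps the
    source as-is; alpha = 0 discards it entirely.
    """
    frame = frozenset(frame)
    out = {A: alpha * m for A, m in mass.items() if A != frame}
    out[frame] = alpha * mass.get(frame, 0.0) + (1.0 - alpha)
    return out

frame = {"car", "truck"}
soft_report = {frozenset({"car"}): 0.9, frozenset(frame): 0.1}
# A human source judged only 60% reliable: its report is weakened on-the-fly
# before combination, so misleading soft data cannot dominate the fusion.
weakened = discount(soft_report, frame, 0.6)
```

Estimating alpha online from agreement with other sources is what turns this static operation into the on-the-fly protection mechanism the abstract describes.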
In addition, this dissertation presents a novel data-centric taxonomy of data fusion
methodologies. In particular, several categories of fusion algorithms have been identified
and discussed based on the data-related challenging aspect(s) addressed. It is intended to
provide the reader with a generic and comprehensive view of the contemporary data fusion
literature, which could also serve as a reference for data fusion practitioners by providing
them with conducive design guidelines, in terms of algorithm choice, regarding the specific
data-related challenges expected in a given application.