Importance of Sources using the Repeated Fusion Method and the Proportional Conflict Redistribution Rules #5 and #6
We present in this paper worked examples of computing the PCR5 fusion rule by hand for three sources, so that the reader can better understand its mechanism. We also take into account the importance of sources, which differs from the classical discounting of sources.
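The proportional conflict redistribution at the heart of PCR5 can be sketched in code. The following is a minimal Python illustration for two sources only (the paper's hand-worked examples use three); the frame elements and mass values are hypothetical, chosen merely to exercise the rule. Each partial conflict m1(X)·m2(Y) with X ∩ Y = ∅ is redistributed back to X and Y proportionally to the masses involved.

```python
from itertools import product

def pcr5(m1, m2):
    """Combine two basic belief assignments (dicts: frozenset -> mass)
    with the PCR5 rule for two sources.

    Non-conflicting products go to the intersection (conjunctive rule);
    each partial conflict m1(X)*m2(Y) with X & Y empty is split back
    onto X and Y proportionally to m1(X) and m2(Y)."""
    out = {}
    for (X, a), (Y, b) in product(m1.items(), m2.items()):
        inter = X & Y
        if inter:
            out[inter] = out.get(inter, 0.0) + a * b
        elif a + b > 0:
            # proportional conflict redistribution
            out[X] = out.get(X, 0.0) + a * a * b / (a + b)
            out[Y] = out.get(Y, 0.0) + b * b * a / (a + b)
    return out

# Hypothetical two-element frame {A, B}
A, B, AB = frozenset({'A'}), frozenset({'B'}), frozenset({'A', 'B'})
m = pcr5({A: 0.6, AB: 0.4}, {B: 0.3, AB: 0.7})
# The combined masses still sum to 1: no mass is lost to conflict.
```

Unlike Dempster's rule, no global normalization is needed: the conflicting mass is returned locally to the elements that generated it, which is why the combined masses always sum to one.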
AHP and uncertainty theories for decision making using the ER-MCDA methodology
In this paper, we present the ER-MCDA methodology for multi-criteria decision making based on imperfect information coming from more or less reliable and conflicting sources. The Analytic Hierarchy Process (AHP), Fuzzy Sets, Possibility, and Belief Function theories are combined to make a decision based on imprecise and uncertain evaluations of quantitative and qualitative criteria. Classical aggregation of criteria is replaced by a two-step fusion process using advanced fusion rules based on the Dezert-Smarandache Theory (DSmT), which makes it possible to distinguish between the importance, reliability, and uncertainty of information sources and contents.
Applying new uncertainty related theories and multicriteria decision analysis methods to snow avalanche risk management
Making the best decision in the event of a snow avalanche is difficult because of the lack of information and knowledge about the natural phenomena and the heterogeneity and variable reliability of the available information sources (historical data, field measurements, and expert assessments). One major goal today is therefore to aid decision making by improving the quality, quantity, and reliability of the available information. This article presents a new method called evidential reasoning and multicriteria decision analysis (ER-MCDA) to help decision making by considering information imperfections arising from several more or less reliable and possibly conflicting sources of information. First, the principles of the existing methods are reviewed: classical methods of multicriteria decision making and existing theories attempting to represent and propagate information imperfections are described. Second, we describe the principle of the ER-MCDA method, which combines multicriteria decision analysis (MCDA) to model the decision-making process with fuzzy sets theory, possibility theory, and evidence theory to represent, fuse, and propagate information imperfections. Experts, considered more or less reliable, provide imprecise and uncertain evaluations of quantitative and qualitative criteria that are combined through information fusion. The method is applied to a simplified version of an existing system for evaluating the sensitivity of avalanche sites. This new method makes it possible to consider both the importance and the reliability of the available information in the decision process. It also contributes to improving traceability. Other developments designed to handle other assessment problems, such as avalanche triggering conditions or data quality, are in progress.
Solving multiple-criteria R&D project selection problems with a data-driven evidential reasoning rule
In this paper, a likelihood-based evidence acquisition approach is proposed to acquire evidence from experts' assessments as recorded in historical datasets. A data-driven evidential reasoning rule based model is then introduced into the R&D project selection process to combine multiple pieces of evidence with different weights and reliabilities. As a result, total belief degrees and overall performance scores can be generated for ranking and selecting projects. Finally, a case study on R&D project selection for the National Science Foundation of China (NSFC) is conducted to show the effectiveness of the proposed model. The data-driven evidential reasoning rule based model for project evaluation and selection (1) uses experimental data to represent experts' assessments as belief distributions over the set of final funding outcomes, so that these historical statistics help experts and applicants understand the funding probability associated with a given assessment grade; (2) captures the mapping relationships between the evaluation grades and the final funding outcomes by using historical data; and (3) provides a way to make fair decisions by taking experts' reliabilities into account. In the data-driven evidential reasoning rule based model, experts play different roles in accordance with their reliabilities, which are determined by their previous review track records, and the selection process is made interpretable and fairer. The newly proposed model reduces the time-consuming panel review work for both managers and experts, and significantly improves the efficiency and quality of the project selection process. Although the model is demonstrated on project selection in the NSFC, it can be generalized to other funding agencies or industries.
Comment: 20 pages, forthcoming in International Journal of Project Management (2019).
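The idea of weighting each expert's belief distribution by a reliability before combination can be illustrated with classical belief function tools. The sketch below is not the evidential reasoning rule of the paper: it substitutes Shafer discounting followed by Dempster's rule as a simplified stand-in, and the frame of funding outcomes and the reliability values are hypothetical.

```python
def discount(m, r, theta):
    """Shafer discounting: keep a fraction r of each mass (r = source
    reliability in [0, 1]) and transfer the remainder 1 - r to the
    whole frame theta (total ignorance)."""
    out = {X: r * v for X, v in m.items()}
    out[theta] = out.get(theta, 0.0) + (1.0 - r)
    return out

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination of two BBAs, then
    renormalization by the non-conflicting mass 1 - K."""
    out, conflict = {}, 0.0
    for X, a in m1.items():
        for Y, b in m2.items():
            inter = X & Y
            if inter:
                out[inter] = out.get(inter, 0.0) + a * b
            else:
                conflict += a * b
    return {X: v / (1.0 - conflict) for X, v in out.items()}

# Hypothetical frame of funding outcomes and two reviewers
THETA = frozenset({'fund', 'reject'})
F, R = frozenset({'fund'}), frozenset({'reject'})
m1 = discount({F: 1.0}, 0.8, THETA)  # reliable reviewer says "fund"
m2 = discount({R: 1.0}, 0.5, THETA)  # less reliable reviewer says "reject"
combined = dempster(m1, m2)
# The more reliable reviewer's verdict dominates the combined belief.
```

The key effect is visible here: because the second reviewer's mass is heavily discounted, the combined belief favors the more reliable reviewer's conclusion, which is the role reliabilities play in the paper's model.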
Tracking Uncertainty Propagation from Model to Formalization: Illustration on Trust Assessment
This paper investigates the use of the URREF ontology to characterize and track uncertainties arising within the modeling and formalization phases. Estimation of trust in reported information, a real-world problem of interest to practitioners in the field of security, was adopted for illustration purposes. A functional model of trust was developed to describe the analysis of reported information, and it was implemented with belief functions. When assessing trust in reported information, the uncertainty arises not only from the quality of sources or information content, but also due to the inability of models to capture the complex chain of interactions leading to the final outcome and to constraints imposed by the representation formalism. A primary goal of this work is to separate known approximations, imperfections and inaccuracies from potential errors, while explicitly tracking the uncertainty from the modeling to the formalization phases. A secondary goal is to illustrate how criteria of the URREF ontology can offer a basis for analyzing performances of fusion systems at early stages, ahead of implementation. Ideally, since uncertainty analysis runs dynamically, it can use the existence or absence of observed states and processes inducing uncertainty to adjust the tradeoff between precision and performance of systems on-the-fly.
Inter-Criteria Analysis Based on Belief Functions for GPS Analysis
In this paper we present an application of the new Belief Function-based Inter-Criteria Analysis (BF-ICrA) approach to Global Positioning System (GPS) Surveying Problems (GSP). GPS surveying is an NP-hard problem: to design a GPS surveying network, a given set of earth points must be observed consecutively, and the survey cost is the sum of the distances traveled from one point to the next. Such problems are hard to solve with traditional numerical methods. In this paper we use BF-ICrA to analyze an Ant Colony Optimization (ACO) algorithm developed to provide near-optimal solutions to the GPS surveying problem.
Kohonen-Based Credal Fusion of Optical and Radar Images for Land Cover Classification
This paper presents a credal algorithm to perform land cover classification from a pair of optical and radar remote sensing images. SAR (Synthetic Aperture Radar)/optical multispectral information fusion is investigated in this study for joint classification. The approach consists of two main steps: 1) relevant feature extraction applied to each sensor in order to model the sources of information, and 2) a Kohonen map-based estimation of Basic Belief Assignments (BBA) dedicated to heterogeneous data. This framework deals with co-registered images and is able to handle complete optical data as well as optical data affected by missing values due to the presence of clouds and shadows during observation. A pair of real SPOT-5 and RADARSAT-2 images is used in the evaluation, and the proposed experiment in a farming area shows very promising results in terms of classification accuracy and missing optical data reconstruction when some data are hidden by clouds.
Extended PCR Rules for Dynamic Frames
In most classical fusion problems modeled with belief functions, the frame of discernment is considered static: the set of elements in the frame and the underlying integrity constraints of the frame are fixed and do not change with time. In some applications, such as target tracking, such an invariant frame is not appropriate because the frame can actually change with time. It is therefore necessary to adapt the Proportional Conflict Redistribution fusion rules (PCR5 and PCR6) to work with dynamic frames. In this paper, we propose an extension of the PCR5 and PCR6 rules for working in a frame subject to non-existential integrity constraints. Such constraints can arise in tracking applications, for example through the destruction of targets. We show through very simple examples how these new rules can be used for the belief revision process.
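To illustrate the kind of belief revision a non-existential constraint requires, here is a minimal sketch in which an element declared non-existent (e.g., a destroyed target) is removed from every focal set and the orphaned mass is renormalized over the surviving focal sets. This is plain normalized conditioning, not the paper's extended PCR5/PCR6 rules, which redistribute such mass differently; the target names are hypothetical.

```python
def revise(m, destroyed):
    """Revise a BBA (dict: frozenset -> mass) under the non-existential
    constraint that `destroyed` no longer exists in the frame.

    The destroyed element is removed from every focal set; mass that
    would land on the empty set is redistributed by renormalizing over
    the surviving focal sets (simple normalized conditioning)."""
    out, orphan = {}, 0.0
    for X, v in m.items():
        Y = X - {destroyed}
        if Y:
            out[Y] = out.get(Y, 0.0) + v
        else:
            orphan += v  # focal set contained only the destroyed element
    total = sum(out.values())
    return {X: v / total for X, v in out.items()} if total else {}

# Hypothetical tracking frame with two targets; target t1 is destroyed.
m = {frozenset({'t1'}): 0.5,
     frozenset({'t2'}): 0.3,
     frozenset({'t1', 't2'}): 0.2}
m_revised = revise(m, 't1')  # all remaining belief falls on {t2}
```

The 0.5 of mass committed solely to the destroyed target is exactly the quantity whose redistribution the extended PCR rules handle proportionally rather than by global renormalization.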