    Classification of Message Spreading in a Heterogeneous Social Network

    Nowadays, social networks such as Twitter, Facebook and LinkedIn have become increasingly popular. They have introduced new habits and new ways of communicating, and every day they collect large amounts of information from many different sources. Most existing research focuses on the analysis of homogeneous social networks, i.e. networks with a single type of node and link. In the real world, however, social networks contain several types of nodes and links, so in order to preserve as much information as possible it is important to treat them as heterogeneous and uncertain. The goal of this paper is to classify a social message based on its spreading in the network, using the theory of belief functions. The proposed classifier interprets the spread of messages over the network, the paths they cross and the types of links involved. We tested our classifier on a real-world network collected from Twitter, and our experiments demonstrate the performance of the belief classifier.
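
    As a rough illustration of the kind of evidential reasoning involved (a minimal sketch, not the paper's actual classifier), the code below combines two hypothetical mass functions, one per link type, with Dempster's rule; the frame {news, rumor} and all numbers are invented:

        # Minimal sketch (not the paper's actual classifier): Dempster's rule
        # combining evidence induced by two link types along a spreading path.
        # The frame {news, rumor} and all masses are hypothetical.
        from itertools import product

        def dempster(m1, m2):
            """Combine two mass functions whose focal sets are frozensets."""
            combined, conflict = {}, 0.0
            for (a, x), (b, y) in product(m1.items(), m2.items()):
                if a & b:
                    combined[a & b] = combined.get(a & b, 0.0) + x * y
                else:
                    conflict += x * y
            k = 1.0 - conflict  # normalization constant
            return {s: v / k for s, v in combined.items()}

        NEWS, RUMOR = frozenset({'news'}), frozenset({'rumor'})
        THETA = NEWS | RUMOR  # total ignorance

        m_follow = {NEWS: 0.6, THETA: 0.4}    # evidence from "follow" links
        m_retweet = {RUMOR: 0.3, THETA: 0.7}  # evidence from "retweet" links

        m = dempster(m_follow, m_retweet)
        print(max(m, key=m.get))  # focal element carrying the largest mass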

    Hierarchical fusion of expert opinion in the Transferable Belief Model, application on climate sensitivity

    This paper examines the fusion of conflicting and non-independent expert opinions in the Transferable Belief Model. Among procedures that combine opinions symmetrically, when beliefs are Bayesian the non-interactive disjunction works better than the non-interactive conjunction, the cautious conjunction or Dempster's combination rule. A hierarchical fusion procedure based on a partition of the experts into schools of thought is then introduced, justified by the sociology-of-science concepts of epistemic communities and competing theories. Within groups, consonant beliefs are aggregated using the cautious conjunction operator, pooling distinct streams of evidence without assuming that experts are independent. Across groups, the non-interactive disjunction is used, on the assumption that when several scientific theories compete they cannot all be true at the same time, but at least one will remain. This procedure balances points of view better than averaging: the number of experts holding a view is not essential. The method is illustrated with a real-world dataset of 16 expert opinions on climate sensitivity from 1995. Climate sensitivity is a key parameter for assessing the severity of the global warming issue. Comparing our findings with recent results suggests that, unfortunately, the plausibility that sensitivity is small (below 1.5°C) has decreased since 1995, while the plausibility that it is above 4.5°C remains high.
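
    A minimal sketch of the two-level scheme, restricted to simple support functions (mass split between one focal set and the whole frame), for which Denoeux's cautious conjunction reduces to taking the minimum weight; the schools, experts, coarse climate-sensitivity bins and numbers are all invented for illustration:

        # Two-level fusion sketch on simple support functions m = {A: 1-w, Theta: w},
        # where the cautious conjunction reduces to min(w1, w2).
        from itertools import product

        LOW, MID, HIGH = 'S<1.5', '1.5<=S<=4.5', 'S>4.5'
        THETA = frozenset({LOW, MID, HIGH})

        def simple_bba(focal, w):
            """Simple support function A^w: mass 1-w on A, w on Theta."""
            return {frozenset(focal): 1.0 - w, THETA: w}

        def cautious_simple(bbas, focal):
            """Cautious conjunction of simple BBAs sharing one focal set:
            keep the minimum weight (the strongest single support)."""
            w = min(b[THETA] for b in bbas)
            return simple_bba(focal, w)

        def disjunctive(m1, m2):
            """Non-interactive disjunction: mass flows to unions of focal sets."""
            out = {}
            for (a, x), (b, y) in product(m1.items(), m2.items()):
                out[a | b] = out.get(a | b, 0.0) + x * y
            return out

        # Two hypothetical schools of thought, two experts each.
        school1 = cautious_simple([simple_bba({MID}, 0.4), simple_bba({MID}, 0.6)], {MID})
        school2 = cautious_simple([simple_bba({HIGH}, 0.5), simple_bba({HIGH}, 0.7)], {HIGH})

        print(disjunctive(school1, school2))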

    Random sets and exact confidence regions

    An important problem in statistics is the construction of confidence regions for unknown parameters. In most cases, asymptotic distribution theory is used to construct confidence regions, so any coverage probability claims hold only approximately, for large samples. This paper describes a new approach, using random sets, which allows users to construct exact confidence regions without appeal to asymptotic theory. In particular, if the user-specified random set satisfies a certain validity property, confidence regions obtained by thresholding the induced data-dependent plausibility function are shown to have the desired coverage probability.
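
    The thresholding recipe can be illustrated on the textbook normal-mean problem (a generic instance of the construction, not necessarily one of the paper's examples): with the default predictive random set, pl(theta) = 1 - |2*Phi(sqrt(n)*(xbar - theta)/sigma) - 1|, and {theta : pl(theta) > alpha} has exact coverage 1 - alpha, coinciding with the usual z-interval:

        # Exact confidence region by thresholding a plausibility function,
        # sketched for a normal mean with known sigma.
        import math

        def plausibility(theta, xbar, sigma, n):
            z = math.sqrt(n) * (xbar - theta) / sigma
            phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
            return 1.0 - abs(2.0 * phi - 1.0)

        def confidence_region(xbar, sigma, n, alpha, grid):
            """Exact 100(1-alpha)% region: keep theta with plausibility > alpha."""
            return [t for t in grid if plausibility(t, xbar, sigma, n) > alpha]

        grid = [i / 100.0 for i in range(-300, 301)]
        region = confidence_region(xbar=0.1, sigma=1.0, n=25, alpha=0.05, grid=grid)
        print(min(region), max(region))  # matches the usual z-interval endpoints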

    Risk-informed decision-making in the presence of epistemic uncertainty

    An important issue in risk analysis is the distinction between epistemic and aleatory uncertainties. In this paper, the use of distinct representation formats for aleatory and epistemic uncertainties is advocated, the latter being modelled by sets of possible values. Modern uncertainty theories based on convex sets of probabilities are known to be instrumental for hybrid representations in which the aleatory and epistemic components of uncertainty remain distinct. Simple uncertainty representation techniques based on fuzzy intervals and p-boxes are used in practice. This paper outlines a risk analysis methodology that runs from the elicitation of knowledge about parameters to the decision. It proposes an elicitation methodology in which the chosen representation format depends on the nature and the amount of available information. Uncertainty propagation methods then blend Monte-Carlo simulation with interval analysis techniques. Nevertheless, the results provided by these techniques, often expressed as probability intervals, may be too complex for a decision-maker to interpret, so we propose to compute a single indicator of the likelihood of risk, called the confidence index, which explicitly accounts for the decision-maker's attitude in the face of ambiguity. This step takes place at the end of the risk analysis process, when no further collection of evidence is possible that might reduce the ambiguity due to epistemic uncertainty. This last feature stands in contrast with the Bayesian methodology, where epistemic uncertainties on input parameters are modelled by single subjective probabilities at the beginning of the risk analysis process.
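
    The propagation step can be sketched as follows, with an invented monotone model and invented numbers: the aleatory input is sampled by Monte-Carlo, the epistemic input is kept as an interval, and each sample yields an interval of outputs, giving lower and upper probabilities of exceeding a threshold; a Hurwicz-style mix then stands in for the confidence index, with the pessimism degree encoding the decision-maker's ambiguity attitude:

        # Hybrid Monte-Carlo / interval propagation sketch (illustrative model,
        # not the paper's case study).
        import random

        def model(a, e):
            return a * e  # placeholder risk model, monotone in e

        def hybrid_propagation(n_samples, e_lo, e_hi, threshold):
            random.seed(0)
            exceed_lo = exceed_hi = 0
            for _ in range(n_samples):
                a = random.lognormvariate(0.0, 0.5)      # aleatory input, sampled
                lo, hi = model(a, e_lo), model(a, e_hi)  # interval output per sample
                exceed_lo += lo > threshold              # necessarily exceeds
                exceed_hi += hi > threshold              # possibly exceeds
            return exceed_lo / n_samples, exceed_hi / n_samples

        bel, pl = hybrid_propagation(10_000, e_lo=0.8, e_hi=1.2, threshold=1.5)
        rho = 0.7  # hypothetical pessimism degree of the decision-maker
        print(bel, pl, rho * pl + (1 - rho) * bel)  # Hurwicz-style single index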

    Multisensor data fusion and belief functions for robust singularity detection in signals

    This paper addresses the problem of robust detection of signal singularities in hostile environments using multisensor data fusion. Measurement uncertainty is usually treated in a probabilistic way, assuming that all lack of knowledge is due to random effects. However, this approach fails when other effects, such as sensor failure, are involved. In order to improve the robustness of singularity detection, an approach based on evidence theory is proposed for both modelling (data alignment) and merging (data fusion) the information coming from multiple redundant sensors. Whereas the fusion step is done classically, the proposed data alignment method is designed to improve singularity detection performance in multisensor settings. Several case studies, designed to reflect real-life situations, are presented, and the results of the probabilistic and evidential approaches are compared. The evidential methods behave better in the face of sensor dysfunction, and the proposed method takes full advantage of component redundancy.
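
    A toy version of the two steps on the frame {S (singularity), N (none)}: the alignment step maps each sensor's normalized detection statistic to a mass function discounted by the sensor's presumed reliability, and the fusion step is the classical Dempster combination; the alignment model and reliability factors are invented, not the paper's:

        def align(stat, reliability):
            """Map a statistic in [0, 1] to masses (m_S, m_N, m_Theta),
            discounted by the sensor's reliability."""
            m_s, m_n = stat, 1.0 - stat
            return reliability * m_s, reliability * m_n, 1.0 - reliability

        def fuse(m1, m2):
            """Dempster's rule written out on the two-element frame {S, N}."""
            s1, n1, t1 = m1
            s2, n2, t2 = m2
            k = 1.0 - (s1 * n2 + n1 * s2)  # mass lost to conflict
            s = (s1 * s2 + s1 * t2 + t1 * s2) / k
            n = (n1 * n2 + n1 * t2 + t1 * n2) / k
            return s, n, t1 * t2 / k

        healthy = align(0.9, reliability=0.95)  # strong evidence of a singularity
        failing = align(0.1, reliability=0.30)  # contradicting, but distrusted, sensor
        m_s, m_n, m_t = fuse(healthy, failing)
        print(m_s > m_n)  # the distrusted sensor barely dents the detection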

    Belief Functions: Theory and Algorithms

    The subject of this thesis is belief function theory and its application in different contexts. Belief function theory can be interpreted as a generalization of Bayesian probability theory and makes it possible to distinguish between different types of uncertainty. In this thesis, applications of belief function theory are explored on both a theoretical and an algorithmic level. The problem of exponential complexity associated with belief function inference is addressed by showing how efficient algorithms can be developed based on Monte-Carlo approximations and the exploitation of independence. The effectiveness of these algorithms is demonstrated in applications to particle filtering, simultaneous localization and mapping, and active classification.
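
    One classic Monte-Carlo scheme of this kind (a generic rejection-sampling approximation of Dempster's rule, not necessarily the thesis's exact algorithm) samples one focal set per source, rejects draws with empty intersection, and estimates Bel(A) as the accepted fraction of intersections contained in A; frame and masses below are hypothetical:

        # Rejection-style Monte-Carlo approximation of combined belief.
        import random

        def sample_focal(mass):
            """Draw a focal set with probability equal to its mass."""
            r, acc = random.random(), 0.0
            for focal, m in mass.items():
                acc += m
                if r <= acc:
                    return focal
            return focal  # guard against floating-point round-off

        def mc_belief(sources, query, n_samples=100_000):
            hits = accepted = 0
            for _ in range(n_samples):
                inter = frozenset.intersection(*(sample_focal(m) for m in sources))
                if inter:                    # reject conflicting draws
                    accepted += 1
                    hits += inter <= query   # sampled focal set entails the query
            return hits / accepted

        A, B, C = 'a', 'b', 'c'
        THETA = frozenset({A, B, C})
        m1 = {frozenset({A}): 0.5, THETA: 0.5}
        m2 = {frozenset({A, B}): 0.6, THETA: 0.4}
        print(mc_belief([m1, m2], query=frozenset({A, B})))  # close to 0.8 here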

    Representing partial ignorance


    Distributed Detection and Fusion in Parallel Sensor Architectures

    A parallel distributed detection system consists of several separate sensor-detector nodes (separated spatially or by their principles of operation), each with some processing capability. These local sensor-detectors send information on an observed phenomenon to a centrally located Data Fusion Center for aggregation and decision making. Often, the local sensors use electro-mechanical, optical or RF modalities and are known as "hard" sensors. For such data sources, the sensor observations have structure and often tractable statistical distributions, which help in weighing their contribution to an integrated global decision. In a distributed detection environment, we often also have humans in the loop, who provide subjective opinions on the observed phenomena; these opinions are labelled "soft" data. It is of interest to integrate "soft" decisions, mostly assessments provided by humans, with data from the "hard" sensors in order to improve global decision reliability. Several techniques have been developed to combine data from traditional hard sensors, and a body of work also exists on the integration of "soft" data; however, relatively little work has been done on combining hard and soft data and decisions in an integrated environment. Our work investigates both "hard" and "hard/soft" fusion schemes and proposes data integration architectures to facilitate heterogeneous sensor data fusion. In the context of "hard" fusion, one contribution of this thesis is an algorithm that provides a globally optimum solution for local detector (hard sensor) design satisfying a Neyman-Pearson criterion (maximal probability of detection under a fixed upper bound on the global false alarm rate) at the fusion center. The thesis also delves into the application of distributed detection techniques in both parallel and sequential frameworks; specifically, we apply parallel detection and fusion schemes to the problem of real-time computer user authentication, and sequential Kalman filtering to real-time hypoxia detection. In the context of "hard/soft" fusion, we propose a new approach based on Dempster-Shafer evidence theory to facilitate heterogeneous sensor data fusion. Application of the framework to a number of simulated example scenarios showcases the wide range of applicability of the developed approach. We also propose and develop a hierarchical evidence-tree architecture for representing nested human opinions. The proposed framework is versatile enough to deal with both hard and soft source data within the evidence theory framework, and it can handle uncertainty as well as data aggregation.
    Ph.D., Electrical Engineering -- Drexel University, 201
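
    For the "hard" fusion setting, here is a sketch of likelihood-ratio fusion of binary local decisions at the fusion center (the Chair-Varshney statistic), thresholded in Neyman-Pearson fashion; the local operating points (pd_i, pf_i) are hypothetical, and the thesis's optimization of the local detectors themselves is not shown:

        # Likelihood-ratio fusion of local 0/1 decisions under a global
        # false-alarm constraint (deterministic thresholding, no randomization).
        import math
        from itertools import product

        def llr(decisions, pd, pf):
            """Log-likelihood ratio of a vector of local 0/1 decisions."""
            s = 0.0
            for u, d, f in zip(decisions, pd, pf):
                s += math.log(d / f) if u else math.log((1 - d) / (1 - f))
            return s

        def global_pfa(threshold, pd, pf):
            """Exact false-alarm rate of the rule 'declare H1 iff llr > threshold'."""
            p = 0.0
            for u in product([0, 1], repeat=len(pf)):
                if llr(u, pd, pf) > threshold:
                    p += math.prod(f if ui else 1 - f for ui, f in zip(u, pf))
            return p

        pd, pf = [0.9, 0.8, 0.7], [0.05, 0.10, 0.15]
        # Pick the smallest threshold keeping the global false alarm rate <= 1%.
        cands = sorted({llr(u, pd, pf) for u in product([0, 1], repeat=3)})
        t = next(t for t in cands if global_pfa(t, pd, pf) <= 0.01)
        print(t, global_pfa(t, pd, pf))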