1,281 research outputs found

    Hierarchical fusion of expert opinion in the Transferable Belief Model, application on climate sensitivity

    This paper examines the fusion of conflicting, non-independent expert opinions in the Transferable Belief Model. Among procedures that combine opinions symmetrically, when beliefs are Bayesian the non-interactive disjunction works better than the non-interactive conjunction, the cautious conjunction, or Dempster's combination rule.

    A hierarchical fusion procedure based on partitioning the experts into schools of thought is then introduced, justified by the sociology-of-science concepts of epistemic communities and competing theories. Within groups, consonant beliefs are aggregated using the cautious conjunction operator, pooling distinct streams of evidence without assuming that experts are independent sources of information. Across groups, the non-interactive disjunction is used, on the assumption that when several scientific theories compete they cannot all be true at the same time, but at least one will remain. This procedure balances points of view better than averaging: in particular, it does not weight opinions by the number of experts holding them.

    The method is illustrated with a real-world 1995 dataset of 16 experts questioned about climate sensitivity, a key parameter for assessing the severity of the global warming issue. Comparing our findings with recent results suggests that, unfortunately, the plausibility that sensitivity is small (below 1.5°C) has decreased since 1995, while the plausibility that it is above 4.5°C remains high.
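The non-interactive disjunction used across schools of thought operates directly on mass functions: the combined mass of a set is the sum of products of masses over all pairs of focal sets whose union is that set. A minimal Python sketch, assuming focal sets are represented as frozensets; the "schools" and climate-sensitivity labels below are purely illustrative, not the paper's dataset:

```python
from itertools import product

def disjunctive_combination(m1, m2):
    """Non-interactive disjunctive rule: m(C) = sum over A, B with
    A ∪ B = C of m1(A) * m2(B). Masses are dicts frozenset -> float."""
    out = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a | b  # union: at least one source is right, we don't say which
        out[c] = out.get(c, 0.0) + wa * wb
    return out

# Hypothetical beliefs of two competing schools of thought
LOW, MID, HIGH = "low", "mid", "high"
m_school1 = {frozenset([LOW]): 0.7, frozenset([LOW, MID]): 0.3}
m_school2 = {frozenset([HIGH]): 0.6, frozenset([MID, HIGH]): 0.4}
pooled = disjunctive_combination(m_school1, m_school2)
```

Note how the result commits only to unions of the two schools' focal sets: disjunction never produces more specific beliefs than either source, which is what makes it cautious across competing theories.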

    Finding Academic Experts on a MultiSensor Approach using Shannon's Entropy

    Expert finding is an information retrieval task concerned with searching for the most knowledgeable people on some topic, based on documents describing people's activities. The task takes a user query as input and returns a list of people sorted by their level of expertise with respect to that query. This paper introduces a novel approach for combining multiple estimators of expertise, based on a multisensor data fusion framework together with the Dempster-Shafer theory of evidence and Shannon's entropy. More specifically, we define three sensors which detect heterogeneous information derived from the textual contents, from the graph structure of the citation patterns of the community of experts, and from profile information about the academic experts. Given the evidence collected, each sensor may nominate different candidates as experts and consequently disagree on a final ranking decision. To resolve these conflicts, we apply the Dempster-Shafer theory of evidence combined with Shannon's entropy formula to fuse this information into a more accurate and reliable final ranking list. Experiments over two datasets of academic publications from the Computer Science domain attest to the adequacy of the proposed approach compared with traditional state-of-the-art approaches. We also ran experiments against representative supervised state-of-the-art algorithms. Results revealed that the proposed method achieves performance similar to these supervised techniques, confirming the capabilities of the proposed framework.
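Dempster's rule, which underlies the fusion step described above, combines sensors conjunctively and renormalizes away the conflicting mass. The sketch below is a minimal illustration, not the paper's exact sensor models: the two mass functions and the entropy helper (a possible way to discount less decisive sensors) are assumptions:

```python
from itertools import product
from math import log2

def dempster(m1, m2):
    """Dempster's rule of combination: conjunctive pooling of two mass
    functions over frozenset focal sets, renormalizing the conflict."""
    out, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b
        if c:
            out[c] = out.get(c, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {s: v / (1.0 - conflict) for s, v in out.items()}

def shannon_entropy(m):
    """Shannon entropy of a mass function; a lower-entropy (more decisive)
    sensor could be weighted more heavily (an illustrative idea, not the
    paper's exact weighting formula)."""
    return -sum(v * log2(v) for v in m.values() if v > 0)

# Two hypothetical sensors assessing whether candidate A or B is the expert
A, B = frozenset({"A"}), frozenset({"B"})
text_sensor = {A: 0.6, A | B: 0.4}   # textual-content sensor
graph_sensor = {A: 0.5, B: 0.5}      # citation-graph sensor
fused = dempster(text_sensor, graph_sensor)
```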

    Robust analysis of uncertainty in scientific assessments

    Uncertainty refers to any limitation in knowledge. Identifying and characterizing uncertainty in conclusions is important to ensure transparency and to avoid over- or underconfidence in scientific assessments. Quantitative expressions of uncertainty are less ambiguous than uncertainty expressed qualitatively, or not at all. Subjective probability is an example of a quantitative expression of epistemic uncertainty which, combined with Bayesian inference, makes it possible to integrate evidence and characterize uncertainty in quantities of interest. This thesis contributes to the understanding and implementation of robust Bayesian analysis as a way to integrate expert judgment and data into assessments and to quantify uncertainty by bounded probability. The robust Bayesian framework is based on sets of probabilities for epistemic uncertainty, where precise probability is seen as a special case. This thesis covers applications relevant for scientific assessments, including evidence synthesis and quantitative risk assessment.

    Paper I proposes to combine two sampling methods, iterative importance sampling and Markov chain Monte Carlo (MCMC) sampling, for quantifying uncertainty by bounded probability when Bayesian updating requires MCMC sampling. This opens up robust Bayesian analysis for complex statistical models. To achieve this, an effective sample size of importance sampling that accounts for correlated MCMC samples is proposed. For illustration, the proposed method is applied to estimate the overall effect with bounded probability in a published meta-analysis within the Collaboration for Environmental Evidence on the effect of biomanipulation on freshwater lakes.

    Paper II demonstrates robust Bayesian analysis as a way to quantify uncertainty in a quantity of interest by bounded probability; it explicitly distinguishes between epistemic and aleatory uncertainty in the assessment and learns parameters by integrating evidence into the model. Robust Bayesian analysis is described as a generalization of Bayesian analysis, with Bayesian analysis through precise probability as a special case. Both analyses are applied to an intake assessment.

    Paper III describes a way to handle uncertainty arising from ignorance or ambiguity about bias terms in a quantitative bias analysis by characterizing bias with imprecision. This is done by specifying bias with a set of bias terms and using robust Bayesian analysis to estimate the overall effect in the meta-analysis. The approach provides a structured framework to transform qualitative judgments concerning risk of bias into quantitative expressions of uncertainty in quantitative bias analysis.

    Paper IV compares the effect of different diversified farming practices on biodiversity and crop yields. This is done by applying a Bayesian network meta-analysis to a new public global database from a systematic protocol on diversified farming. A portfolio analysis calibrated by the network meta-analyses showed that uncertainty about the mean performance is large compared to the variability in performance across different farms.
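The idea of quantifying uncertainty by bounded probability via a set of priors can be sketched with a conjugate Beta-Binomial model: each prior in the set yields a closed-form posterior, and sweeping the set gives lower and upper bounds on the quantity of interest. The prior set and data below are illustrative assumptions, not taken from the thesis:

```python
def posterior_mean(a, b, successes, trials):
    """Posterior mean of a Beta-Binomial model with prior Beta(a, b):
    (a + s) / (a + b + n), by conjugacy."""
    return (a + successes) / (a + b + trials)

def bounded_posterior_mean(prior_set, successes, trials):
    """Robust Bayesian bound: range of the posterior mean over a set of
    Beta priors expressing epistemic uncertainty about the prior."""
    means = [posterior_mean(a, b, successes, trials) for a, b in prior_set]
    return min(means), max(means)

# Hypothetical prior set: uniform, Jeffreys, and two informative priors
priors = [(1, 1), (0.5, 0.5), (2, 8), (8, 2)]
lo, hi = bounded_posterior_mean(priors, successes=7, trials=20)
```

A precise Bayesian analysis is recovered as the special case of a one-element prior set, which is the sense in which robust Bayesian analysis generalizes it.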

    Advances and Applications of DSmT for Information Fusion

    This book is devoted to an emerging branch of Information Fusion based on a new approach to modelling fusion problems in which the information provided by the sources is both uncertain and (highly) conflicting. This approach, known in the literature as DSmT (Dezert-Smarandache Theory), proposes new, useful rules of combination.

    Advances and Applications of Dezert-Smarandache Theory (DSmT), Vol. 1

    The Dezert-Smarandache Theory (DSmT) of plausible and paradoxical reasoning is a natural extension of the classical Dempster-Shafer Theory (DST), but includes fundamental differences from the DST. DSmT makes it possible to formally combine any types of independent sources of information represented in terms of belief functions, but is mainly focused on the fusion of uncertain, highly conflicting, and imprecise quantitative or qualitative sources of evidence. DSmT is able to solve complex static or dynamic fusion problems beyond the limits of the DST framework, especially when conflicts between sources become large and when refinement of the frame of the problem under consideration becomes inaccessible because of the vague, relative, and imprecise nature of its elements. DSmT is used in cybernetics, robotics, medicine, military, and other engineering applications where the fusion of sensors' information is required.
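One of the combination rules studied in the DSmT literature, PCR5 (Proportional Conflict Redistribution rule no. 5), redistributes each partial conflict back to the two sets that produced it, proportionally to their masses, instead of renormalizing it globally as Dempster's rule does. A minimal two-source sketch; the mass functions below are illustrative assumptions:

```python
def pcr5(m1, m2):
    """PCR5 for two sources: conjunctive combination, then each conflicting
    product m1(A)*m2(B) (with A ∩ B = ∅) is split between A and B in
    proportion to m1(A) and m2(B). Masses are dicts frozenset -> float."""
    out = {}
    for a, wa in m1.items():
        for b, wb in m2.items():
            c = a & b
            if c:
                out[c] = out.get(c, 0.0) + wa * wb
            elif wa + wb > 0:
                out[a] = out.get(a, 0.0) + wa * wa * wb / (wa + wb)
                out[b] = out.get(b, 0.0) + wb * wb * wa / (wa + wb)
    return out

# Two highly conflicting sources (a Zadeh-style example)
A, B = frozenset({"A"}), frozenset({"B"})
m1 = {A: 0.9, B: 0.1}
m2 = {A: 0.1, B: 0.9}
fused = pcr5(m1, m2)
```

Because each conflicting product is split so that the two shares sum back to it, PCR5 conserves the total mass and stays well defined even under the high conflict that breaks Dempster's rule.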

    Probabilistic Opinion Pooling with Imprecise Probabilities

    The question of how the probabilistic opinions of different individuals should be aggregated to form a group opinion is controversial. But one assumption seems to be pretty much common ground: for a group of Bayesians, the representation of group opinion should itself be a unique probability distribution (Madansky 44; Lehrer and Wagner 34; McConway Journal of the American Statistical Association, 76(374), 410--414, 45; Bordley Management Science, 28(10), 1137--1148, 5; Genest et al. The Annals of Statistics, 487--501, 21; Genest and Zidek Statistical Science, 114--135, 23; Mongin Journal of Economic Theory, 66(2), 313--351, 46; Clemen and Winkler Risk Analysis, 19(2), 187--203, 7; Dietrich and List 14; Herzberg Theory and Decision, 1--19, 28). We argue that this assumption is not always in order. We show how to extend the canonical mathematical framework for pooling to cover pooling with imprecise probabilities (IP) by employing set-valued pooling functions and generalizing common pooling axioms accordingly. As a proof of concept, we then show that one IP construction satisfies a number of central pooling axioms that are not jointly satisfied by any of the standard pooling recipes on pain of triviality. Following Levi (Synthese, 62(1), 3--11, 39), we also argue that IP models admit of a much better philosophical motivation as a model of rational consensus
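The set-valued pooling idea can be made concrete: each agent's imprecise opinion is a set of probability vectors over the same outcomes, and a pooled set can be built from linear pools of every selection of member distributions. The construction, weight grid, and numbers below are a simplified assumption for illustration, not the specific IP construction the paper defends:

```python
from itertools import product

def ip_linear_pools(opinion_sets, weights_grid):
    """Set-valued pooling sketch: for every selection of one distribution
    per agent and every weight vector in the grid, form the linear pool.
    Distributions are tuples of probabilities over the same outcomes."""
    pooled = set()
    for selection in product(*opinion_sets):
        for w in weights_grid:
            vec = tuple(sum(wi * p[i] for wi, p in zip(w, selection))
                        for i in range(len(selection[0])))
            pooled.add(tuple(round(x, 10) for x in vec))  # round for set dedup
    return pooled

# Agent 1 holds an imprecise opinion (two admissible distributions);
# agent 2 holds a precise one. Outcomes: (rain, no rain), hypothetically.
agent1 = [(0.2, 0.8), (0.3, 0.7)]
agent2 = [(0.6, 0.4)]
weights = [(0.5, 0.5), (1.0, 0.0), (0.0, 1.0)]
pool = ip_linear_pools([agent1, agent2], weights)
```

The output is a set of distributions rather than a single one, which is exactly the departure from the "unique probability distribution" assumption that the paper argues against.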

    Combination of Evidence in Dempster-Shafer Theory
