
    Unifying Practical Uncertainty Representations: II. Clouds

    There exist many simple tools for jointly capturing variability and incomplete information by means of uncertainty representations. Among them are random sets, possibility distributions, probability intervals, and the more recent Ferson's p-boxes and Neumaier's clouds, both defined by pairs of possibility distributions. In the companion paper, we have extensively studied a generalized form of p-box and situated it with respect to other models. This paper focuses on the links between clouds and other representations. Generalized p-boxes are shown to be clouds with comonotonic distributions. In general, clouds cannot be represented by random sets, and in fact not even by 2-monotone (convex) capacities. Comment: 30 pages, 7 figures; pre-print of a journal paper to be published in the International Journal of Approximate Reasoning (with an expanded section concerning clouds and probability intervals).
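
    For orientation, here is the membership condition usually attached to a cloud in Neumaier's sense, written as a sketch of the textbook definition; the symbols δ, π and the probability P are generic notation and are not taken from this paper.

```latex
% A cloud on X is a pair of mappings \delta \le \pi from X to [0,1],
% with \pi reaching 1 and \delta reaching 0 somewhere on X.
% A probability measure P is said to belong to the cloud (\delta, \pi) iff
P\bigl(\{x : \delta(x) \ge \alpha\}\bigr) \;\le\; 1 - \alpha \;\le\; P\bigl(\{x : \pi(x) > \alpha\}\bigr)
\qquad \text{for all } \alpha \in [0,1].
```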

    Numerical Sensitivity and Efficiency in the Treatment of Epistemic and Aleatory Uncertainty

    The treatment of both aleatory and epistemic uncertainty by recent methods often requires a high computational effort. In this abstract, we propose a numerical sampling method that lightens the computational burden of treating information represented by so-called fuzzy random variables.
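
    A minimal sketch of the kind of hybrid treatment referred to here, assuming a model f(x, e) that is monotone (increasing) in its epistemic argument; the function, the triangular fuzzy number and all names are illustrative assumptions, not the method of the abstract.

```python
import numpy as np

def propagate_fuzzy_random(f, sample_aleatory, alpha_cut, n_samples=1000, n_alpha=11, rng=None):
    """Hybrid propagation sketch: Monte Carlo sampling over the aleatory input,
    alpha-cut (interval) propagation over the epistemic fuzzy input.

    f               : model f(x, e), assumed increasing in e
    sample_aleatory : callable returning one random draw of the aleatory input
    alpha_cut       : callable alpha -> (lo, hi), the alpha-cut of the fuzzy input
    Returns, for each alpha level, sampled lower/upper output bounds.
    """
    rng = rng or np.random.default_rng(0)
    alphas = np.linspace(0.0, 1.0, n_alpha)
    lower = np.zeros((n_alpha, n_samples))
    upper = np.zeros((n_alpha, n_samples))
    for j in range(n_samples):
        x = sample_aleatory(rng)
        for i, a in enumerate(alphas):
            lo, hi = alpha_cut(a)
            # monotonicity in e: the bounds over the cut are reached at its endpoints
            lower[i, j] = f(x, lo)
            upper[i, j] = f(x, hi)
    return alphas, lower, upper

# illustrative use: aleatory x ~ N(0,1), triangular fuzzy number on [1, 3] with mode 2
alphas, lo, hi = propagate_fuzzy_random(
    f=lambda x, e: x + e,
    sample_aleatory=lambda rng: rng.normal(0.0, 1.0),
    alpha_cut=lambda a: (1.0 + a, 3.0 - a),
)
```

    The cost of this naive scheme grows as n_samples times n_alpha model evaluations, which is precisely the burden that dedicated sampling methods aim to reduce.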

    Computing Expectations with Continuous P-Boxes: Univariate Case

    Given an imprecise probabilistic model over a continuous space, computing lower/upper expectations is often computationally hard, even in simple cases. Because expectations are essential in decision making and risk analysis, tractable methods to compute them are crucial in many applications involving imprecise probabilistic models. We concentrate on p-boxes (a simple and popular model) and on the computation of lower expectations of non-monotone functions. This paper is devoted to the univariate case, that is, the case where only one variable is uncertain. We propose and compare two approaches: the first using general linear programming, and the second using the fact that p-boxes are special cases of random sets. We underline the complementarity of both approaches, as well as their differences. Comment: 31 pages, 6 figures; constitutes an extended version of a short paper accepted at the ISIPTA conference, and a preprint version of a paper accepted in IJA
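
    As an illustration of the second (random-set) view, a rough sketch under the usual convention that a p-box is a pair of CDFs F_lower <= F_upper, given here through their quantile functions; the inner minimisation by grid search and the example p-box are simplifying assumptions, not the paper's algorithm.

```python
import numpy as np

def lower_expectation_pbox(f, F_lower_inv, F_upper_inv, n_levels=1000):
    """Random-set view of a p-box [F_lower, F_upper] on the real line (sketch).

    Each level alpha in (0,1) yields a focal interval
        A_alpha = [F_upper_inv(alpha), F_lower_inv(alpha)]
    (F_upper >= F_lower pointwise, so its quantile gives the left endpoint).
    The lower expectation of f is approximated by averaging, over alpha,
    the minimum of f on each focal interval (crude grid minimisation).
    """
    alphas = (np.arange(n_levels) + 0.5) / n_levels
    total = 0.0
    for a in alphas:
        left, right = F_upper_inv(a), F_lower_inv(a)
        xs = np.linspace(left, right, 101)
        total += np.min(f(xs))
    return total / n_levels

# illustrative p-box: CDFs squeezed between those of U[0,1] and U[0.5,1.5]
low = lower_expectation_pbox(
    f=lambda x: (x - 1.0) ** 2,        # a non-monotone function
    F_lower_inv=lambda a: 0.5 + a,     # quantile of the lower CDF
    F_upper_inv=lambda a: a,           # quantile of the upper CDF
)
```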

    Computing expectations with p-boxes: two views of the same problem

    Given an imprecise probabilistic model over a continuous space, computing lower (upper) expectations is often computationally hard, even in simple cases. Building tractable methods to do so is thus a crucial point in applications. In this paper, we concentrate on p-boxes (a simple and popular model) and on lower expectations computed over non-monotone functions. For various particular cases, we propose tractable methods to compute approximations or exact values of these lower expectations. We found it worthwhile to highlight and compare two approaches: the first using general linear programming, and the second using the fact that p-boxes are special cases of random sets. We underline the complementarity of both approaches, as well as their differences.
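
    Complementing the random-set sketch above, here is a hedged sketch of the linear-programming view on a finite grid; the discretisation and the example p-box are illustrative assumptions, not the constructions of the paper.

```python
import numpy as np
from scipy.optimize import linprog

def lower_expectation_lp(f, F_lower, F_upper, xs):
    """Linear-programming view (sketch): place probability masses p_i on a finite
    grid xs, constrain the discrete CDF to stay inside the p-box [F_lower, F_upper],
    and minimise sum_i p_i * f(x_i)."""
    n = len(xs)
    fx = np.array([f(x) for x in xs])
    # cumulative constraints: F_lower(x_k) <= sum_{i<=k} p_i <= F_upper(x_k)
    A_cum = np.tril(np.ones((n, n)))
    A_ub = np.vstack([A_cum, -A_cum])
    b_ub = np.concatenate([[F_upper(x) for x in xs], [-F_lower(x) for x in xs]])
    A_eq = np.ones((1, n))
    b_eq = np.array([1.0])
    res = linprog(fx, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0.0, 1.0)] * n, method="highs")
    return res.fun

# same illustrative p-box and function as in the random-set sketch
xs = np.linspace(0.0, 1.5, 151)
low = lower_expectation_lp(
    f=lambda x: (x - 1.0) ** 2,
    F_lower=lambda x: np.clip(x - 0.5, 0.0, 1.0),
    F_upper=lambda x: np.clip(x, 0.0, 1.0),
    xs=xs,
)
```

    The LP grows with the grid size but handles arbitrary objective functions, whereas the random-set view exploits the focal-interval structure; this is the complementarity the abstract alludes to.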

    Other uncertainty theories based on capacities

    The two main uncertainty representations in the literature that tolerate imprecision are possibility distributions and random disjunctive sets. This chapter devotes special attention to the theories that have emerged from them. The first part of the chapter discusses epistemic logic and derives the need for capturing imprecision in information representations. It bridges the gap between uncertainty theories and epistemic logic, showing that imprecise probabilities subsume the modalities of possibility and necessity as well as probability. The second part presents possibility and evidence theories, their origins, assumptions and semantics, and discusses the connections between them and the general framework of imprecise probability. Finally, the chapter points out the remaining discrepancies between the different theories regarding various basic notions, such as conditioning, independence or information fusion, and the existing bridges between them.
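
    For reference, the standard set functions of the two theories discussed here (textbook definitions, not quoted from the chapter):

```latex
% Possibility theory: from a possibility distribution \pi : X \to [0,1]
\Pi(A) = \sup_{x \in A} \pi(x), \qquad
N(A) = 1 - \Pi(A^{c}) = \inf_{x \notin A} \bigl(1 - \pi(x)\bigr).
% Evidence theory: from a mass function m on subsets of X with \sum_{E} m(E) = 1
Bel(A) = \sum_{E \subseteq A} m(E), \qquad
Pl(A) = \sum_{E \cap A \neq \emptyset} m(E) = 1 - Bel(A^{c}).
```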

    Interval analysis on non-linear monotonic systems as an efficient tool to optimise fresh food packaging

    When little data or information is available, the validity of studies performing uncertainty analysis or robust design optimisation (i.e., parameter optimisation under uncertainty) with a probabilistic approach is questionable. This is particularly true in some agronomical fields, where parameter and variable uncertainties are often quantified by a handful of measurements or by expert opinions. In this paper, we propose a simple alternative approach based on interval analysis, which avoids the pitfalls of a classical probabilistic approach. We propose simple methods to achieve uncertainty propagation, parameter optimisation and sensitivity analysis in cases where the model satisfies some monotonicity properties. As a real-world case study, we consider the application developed in our laboratory that motivated the present work: the design of sustainable food packaging that preserves fresh fruits and vegetables as long as possible.
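
    A minimal sketch of interval propagation under the monotonicity assumption invoked here; the function, bounds and sign conventions below are illustrative, not the packaging model of the paper.

```python
def interval_propagate_monotone(f, bounds, signs):
    """Interval propagation sketch for a function monotone in each input.

    bounds : list of (lo, hi) input intervals
    signs  : list of +1 (f increasing in that input) or -1 (decreasing)
    By monotonicity, the output interval is obtained from just two model
    evaluations, at the 'all-lowering' and 'all-raising' corners of the box.
    """
    x_min = [lo if s > 0 else hi for (lo, hi), s in zip(bounds, signs)]
    x_max = [hi if s > 0 else lo for (lo, hi), s in zip(bounds, signs)]
    return f(*x_min), f(*x_max)

# illustrative model, increasing in a and decreasing in b
out_lo, out_hi = interval_propagate_monotone(
    f=lambda a, b: a / (1.0 + b),
    bounds=[(2.0, 3.0), (0.1, 0.5)],
    signs=[+1, -1],
)
```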

    Special Cases

    This chapter reviews special cases of lower previsions that are instrumental in practical applications. We emphasize their various advantages and drawbacks, as well as the kinds of problems in which they are most useful.

    How to Handle Missing Values in Multi-Criteria Decision Aiding?

    It is often the case in applications of Multi-Criteria Decision Making that the values of alternatives are unknown on some attributes. An interesting situation arises when the attributes having missing values are actually not relevant and shall thus be removed from the model. Given a model that has been elicited on the complete set of attributes, we thus look for a way, called a restriction operator, to automatically remove the missing attributes from this model. Axiomatic characterizations are proposed for three classes of models. For general quantitative models, the restriction operator is characterized by linearity, recursivity and decomposition on variables. The second class is the set of monotone quantitative models satisfying normalization conditions. The linearity axiom is changed to fit these conditions. Adding recursivity and symmetry, the restriction operator takes the form of a normalized average. For the last class of models, namely the Choquet integral, we obtain a simpler expression. Finally, a very intuitive interpretation is provided for this last model.
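
    For readers unfamiliar with the aggregation model of the last class, a small sketch of a discrete Choquet integral with respect to a capacity; it illustrates the model only, not the paper's restriction operator, and the criteria names and capacity values are illustrative.

```python
def choquet_integral(values, capacity):
    """Discrete Choquet integral of a score vector w.r.t. a capacity (sketch).

    values   : dict criterion -> score in [0, 1]
    capacity : callable frozenset(criteria) -> weight in [0, 1],
               with capacity(empty set) = 0 and capacity(all criteria) = 1
    """
    items = sorted(values.items(), key=lambda kv: kv[1])  # scores in increasing order
    remaining = set(values)
    total, previous = 0.0, 0.0
    for criterion, score in items:
        # weight the increment by the capacity of the criteria still at least this good
        total += (score - previous) * capacity(frozenset(remaining))
        previous = score
        remaining.discard(criterion)
    return total

# illustrative 2-criteria capacity expressing positive interaction
weights = {frozenset(): 0.0, frozenset({"a"}): 0.3,
           frozenset({"b"}): 0.4, frozenset({"a", "b"}): 1.0}
score = choquet_integral({"a": 0.6, "b": 0.9}, capacity=lambda s: weights[s])
```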

    Idempotent conjunctive combination of belief functions: Extending the minimum rule of possibility theory.

    When conjunctively merging two belief functions concerning a single variable but coming from different sources, Dempster's rule of combination is justified only when the information sources can be considered independent. When dependencies between sources are ill-known, it is usual to require the property of idempotence for the merging of belief functions, as this property captures the possible redundancy of dependent sources. To study idempotent merging, different strategies can be followed. One strategy is to rely on idempotent rules used in either more general or more specific frameworks and to study, respectively, their particularisation or extension to belief functions. In this paper, we study the feasibility of extending the idempotent fusion rule of possibility theory (the minimum) to belief functions. We first investigate how comparisons of information content, in the form of inclusion and least commitment, can be exploited to relate idempotent merging in possibility theory to evidence theory. We reach the conclusion that unless we accept the idea that the result of the fusion process can be a family of belief functions, such an extension is not always possible. As handling such families seems impractical, we then turn our attention to a more quantitative criterion and consider those combinations that maximise the expected cardinality of the joint belief function, among the least committed ones, taking advantage of the fact that the expected cardinality of a belief function depends only on its contour function.
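
    A small sketch of the two quantities behind the quantitative criterion (contour function and expected cardinality), under a finite-frame, dictionary-based encoding of mass functions that is our assumption, not the paper's implementation.

```python
from collections import defaultdict

def contour_function(mass):
    """Contour function (singleton plausibility) of a mass function (sketch).

    mass : dict frozenset -> m(E), with masses summing to 1
    Returns pl(x) = sum of masses of the focal sets containing x.
    """
    pl = defaultdict(float)
    for focal, m in mass.items():
        for x in focal:
            pl[x] += m
    return dict(pl)

def expected_cardinality(mass):
    """Expected cardinality sum_E m(E) * |E|; it equals sum_x pl(x),
    so it depends on the mass function only through its contour function."""
    return sum(m * len(focal) for focal, m in mass.items())

# illustrative mass function on the frame {1, 2, 3}
m = {frozenset({1}): 0.2, frozenset({1, 2}): 0.5, frozenset({1, 2, 3}): 0.3}
pl = contour_function(m)                       # {1: 1.0, 2: 0.8, 3: 0.3}
assert abs(expected_cardinality(m) - sum(pl.values())) < 1e-12
```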