22 research outputs found

    Eventology versus contemporary theories of uncertainty

    Get PDF
    The development of probability theory, together with the Bayesian approach, over the last three centuries has been driven by two factors: the variability of physical phenomena and partial ignorance about them. As it is now standard to believe [Dubois, 2007], these two factors are so different in nature that describing them requires special uncertainty theories, distinct from probability theory and the Bayesian credo, which provide a better account of the various facets of uncertainty by combining probabilistic and set-valued representations of information to capture the distinction between variability and ignorance. Eventology [Vorobyev, 2007], a new direction in probability theory and philosophy, offers an original event-based approach to describing variability and ignorance, placing the agent, together with his or her beliefs, directly within the framework of scientific research in the form of an eventological distribution of his or her own events. This allows eventology, by combining probabilistic and set-event representations of information with the philosophical concept of event as co-being [Bakhtin, 1920], to provide a unified account of the various aspects of uncertainty, capturing the distinction between variability and ignorance and opening the possibility of defining an imprecise probability as the probability of an imprecise event within the mathematical framework of Kolmogorov's probability theory [Kolmogorov, 1933].
    Keywords: uncertainty, probability, event, co-being, eventology, imprecise event

    Other uncertainty theories based on capacities

    Get PDF
    The two main uncertainty representations in the literature that tolerate imprecision are possibility distributions and random disjunctive sets. This chapter devotes special attention to the theories that have emerged from them. The first part of the chapter discusses epistemic logic and derives the need for capturing imprecision in information representations. It bridges the gap between uncertainty theories and epistemic logic, showing that imprecise probabilities subsume modalities of possibility and necessity as much as probability. The second part presents possibility and evidence theories, their origins, assumptions and semantics, discusses the connections between them, and situates them within the general framework of imprecise probability. Finally, the chapter points out the remaining discrepancies between the different theories regarding various basic notions, such as conditioning, independence or information fusion, and the existing bridges between them.
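    The two representations discussed in this chapter can be illustrated numerically. The following Python sketch (an illustration of the standard definitions, not code from the chapter; the three-element frame, mass function, and possibility distribution are invented) computes belief and plausibility from a random-set mass assignment, and necessity and possibility from a possibility distribution.

    ```python
    def belief(event, mass):
        """Bel(A): total mass of the focal sets contained in A."""
        return sum(m for focal, m in mass.items() if focal <= event)

    def plausibility(event, mass):
        """Pl(A): total mass of the focal sets that intersect A."""
        return sum(m for focal, m in mass.items() if focal & event)

    def possibility(event, poss):
        """Pi(A): largest possibility degree inside A."""
        return max((poss[x] for x in event), default=0.0)

    def necessity(event, poss):
        """N(A) = 1 - Pi(complement of A)."""
        return 1.0 - max((p for x, p in poss.items() if x not in event), default=0.0)

    # Invented example on the frame {a, b, c}: a random disjunctive set
    # given by a mass function over its focal sets ...
    mass = {frozenset('a'): 0.5, frozenset('ab'): 0.3, frozenset('abc'): 0.2}
    # ... and a possibility distribution with at least one fully possible outcome
    poss = {'a': 1.0, 'b': 0.7, 'c': 0.2}

    A = frozenset('ab')
    print(belief(A, mass), plausibility(A, mass))    # 0.8 1.0
    print(necessity(A, poss), possibility(A, poss))  # 0.8 1.0
    ```

    In both pairs the lower measure never exceeds the upper one; the gap between them is exactly the imprecision these theories use to separate ignorance from variability.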

    ISIPTA'07: Proceedings of the Fifth International Symposium on Imprecise Probability: Theories and Applications

    Get PDF

    Precise Propagation of Upper and Lower Probability Bounds in System P

    Full text link
    In this paper we consider the inference rules of System P in the framework of coherent imprecise probabilistic assessments. Exploiting our algorithms, we propagate the lower and upper probability bounds associated with the conditional assertions of a given knowledge base, automatically obtaining the precise probability bounds for the derived conclusions of the inference rules. This allows a more flexible and realistic use of System P in default reasoning and provides an exact illustration of the degradation of the inference rules when interpreted in probabilistic terms. We also examine the disjunctive Weak Rational Monotony of System P+ proposed by Adams in his extended probability logic. Comment: 8 pages - 8th Intl. Workshop on Non-Monotonic Reasoning NMR'2000, April 9-11, Breckenridge, Colorado
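    As a minimal sketch of the kind of bound propagation involved (my own illustration, not the authors' algorithms): for the And rule of System P, the coherent lower and upper bounds on the conclusion are the Fréchet bounds, computable directly from interval assessments on the two premises.

    ```python
    def and_rule_bounds(b_given_a, c_given_a):
        """Propagate interval bounds through the And rule of System P:
        from P(B|A) in [x1, x2] and P(C|A) in [y1, y2], the coherent
        (Frechet) bounds for P(B and C | A) are
        [max(0, x1 + y1 - 1), min(x2, y2)]."""
        (x1, x2), (y1, y2) = b_given_a, c_given_a
        lower = max(0.0, x1 + y1 - 1.0)
        upper = min(x2, y2)
        return lower, upper

    # Invented example: two defaults each held with probability at least 0.9
    print(and_rule_bounds((0.9, 1.0), (0.9, 1.0)))  # (0.8, 1.0)
    ```

    The drop of the guaranteed lower bound from 0.9 on each premise to 0.8 on the conjunction is an instance of the degradation of the inference rules that the abstract mentions.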

    Symmetry of models versus models of symmetry

    Full text link
    A model for a subject's beliefs about a phenomenon may exhibit symmetry, in the sense that it is invariant under certain transformations. On the other hand, such a belief model may be intended to represent that the subject believes or knows that the phenomenon under study exhibits symmetry. We defend the view that these are fundamentally different things, even though the difference cannot be captured by Bayesian belief models. In fact, the failure to distinguish between both situations leads to Laplace's so-called Principle of Insufficient Reason, which has been criticised extensively in the literature. We show that there are belief models (imprecise probability models, coherent lower previsions) that generalise and include the Bayesian belief models, but where this fundamental difference can be captured. This leads to two notions of symmetry for such belief models: weak invariance (representing symmetry of beliefs) and strong invariance (modelling beliefs of symmetry). We discuss various mathematical as well as more philosophical aspects of these notions. We also discuss a few examples to show the relevance of our findings both to probabilistic modelling and to statistical inference, and to the notion of exchangeability in particular. Comment: 61 pages
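    The distinction can be made concrete with a toy lower-prevision model (my own construction; the two-outcome space, gambles, and credal sets are invented for illustration). The sketch represents a lower prevision as a minimum expectation over the extreme points of a credal set, and contrasts the two invariance notions under the symmetry that swaps the outcomes of a coin flip.

    ```python
    # Gambles on the two-outcome space {'H', 'T'}; a credal set is given by
    # the extreme points of a set of probability mass functions, and the
    # lower prevision of a gamble is its minimum expectation over them.

    def expectation(gamble, p):
        return sum(p[w] * gamble[w] for w in gamble)

    def lower_prevision(gamble, credal_set):
        return min(expectation(gamble, p) for p in credal_set)

    def swap(gamble):
        """Transform a gamble under the symmetry exchanging H and T."""
        return {'H': gamble['T'], 'T': gamble['H']}

    vacuous = [{'H': 1.0, 'T': 0.0}, {'H': 0.0, 'T': 1.0}]  # extreme points of the vacuous model
    uniform = [{'H': 0.5, 'T': 0.5}]                         # precise symmetric model

    f = {'H': 1.0, 'T': 0.0}  # indicator gamble on heads

    # Weak invariance: P(f) == P(swap(f)) -- symmetry OF the belief model
    print(lower_prevision(f, vacuous), lower_prevision(swap(f), vacuous))  # 0.0 0.0

    # Strong invariance: P(f - swap(f)) == 0 -- modelled belief IN symmetry
    diff = {w: f[w] - swap(f)[w] for w in f}
    print(lower_prevision(diff, vacuous))  # -1.0 : the vacuous model fails it
    print(lower_prevision(diff, uniform))  # 0.0  : the uniform model satisfies it
    ```

    The vacuous model treats H and T symmetrically (weak invariance) yet refuses to commit to the coin itself being symmetric; only the precise uniform model, which is strongly invariant, expresses a belief that the phenomenon is symmetric.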

    Decision making with belief functions: Compatibility and incompatibility with the sure-thing principle

    Get PDF
    This article studies situations in which information is ambiguous and only part of it can be probabilized. It is shown that the information can be modeled through belief functions if and only if the nonprobabilizable information is subject to the principles of complete ignorance. Next, the representability of decisions by belief functions on outcomes is justified by means of a neutrality axiom. The natural weakening of Savage's sure-thing principle to unambiguous events is examined and its implications for decision making are identified.
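    For concreteness, here is a hedged sketch of a standard decision criterion used with belief functions (the Choquet integral of utility, which reduces to expected utility when the belief function is a probability measure); it is an illustration of the general setting, not necessarily the article's own representation, and the mass function and utilities are invented.

    ```python
    def belief(event, mass):
        """Bel(A): total mass of the focal sets contained in A."""
        return sum(m for focal, m in mass.items() if focal <= event)

    def choquet_expected_utility(utility, mass):
        """Choquet integral of a nonnegative utility function with respect
        to the belief function induced by `mass`: walk the outcomes in
        decreasing utility, weighting each utility increment by Bel of the
        upper level set accumulated so far."""
        outcomes = sorted(utility, key=utility.get, reverse=True)
        total, level = 0.0, frozenset()
        for i, w in enumerate(outcomes):
            level |= {w}
            u_next = utility[outcomes[i + 1]] if i + 1 < len(outcomes) else 0.0
            total += (utility[w] - u_next) * belief(level, mass)
        return total

    # Invented example: an act with outcomes a, b, c under ambiguous evidence
    mass = {frozenset('a'): 0.5, frozenset('ab'): 0.3, frozenset('abc'): 0.2}
    utility = {'a': 10.0, 'b': 6.0, 'c': 0.0}
    print(choquet_expected_utility(utility, mass))  # about 6.8
    ```

    Because Bel is only superadditive, the Choquet value is pessimistic: the nonprobabilized mass on {a, b} and {a, b, c} is effectively allocated to the worst outcome in each focal set, which is how complete ignorance about the unprobabilizable part enters the evaluation.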