190 research outputs found

    Probabilistic entailment in the setting of coherence: The role of quasi conjunction and inclusion relation

    In this paper, adopting a coherence-based probabilistic approach to default reasoning, we focus on the logical operation of quasi conjunction and the Goodman-Nguyen inclusion relation for conditional events. We recall that quasi conjunction is a basic notion for defining consistency of conditional knowledge bases. Deepening results given in a previous paper, we show that, given any finite family of conditional events F and any nonempty subset S of F, the family F p-entails the quasi conjunction C(S); then, given any conditional event E|H, we analyze the equivalence between p-entailment of E|H from F and p-entailment of E|H from C(S), for some nonempty subset S of F. We also illustrate some alternative theorems related to p-consistency and p-entailment. Finally, we deepen the study of the connections between p-entailment and the inclusion relation by introducing, for a pair (F, E|H), the (possibly empty) class K of subsets S of F such that C(S) implies E|H. We show that the class K satisfies several properties; in particular, K is additive and has a greatest element, which can be determined by a suitable algorithm.
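    The following is a minimal Python sketch (not taken from the paper) of the three-valued semantics behind quasi conjunction: for a family S of conditional events Ei|Hi, C(S) is true when at least one member is true and none is false, false when some member is false, and void when all members are void. The world representation and the example events are invented for illustration.

        TRUE, FALSE, VOID = 1, 0, None   # three-valued status of a conditional event E|H

        def evaluate(conditional, world):
            """Evaluate the conditional event E|H in a world: void if H fails,
            otherwise true or false according to E."""
            E, H = conditional               # E and H are predicates over worlds
            if not H(world):
                return VOID
            return TRUE if E(world) else FALSE

        def quasi_conjunction(family, world):
            """C(S): void if every member is void, false if some member is false,
            true otherwise (at least one member true and none false)."""
            values = [evaluate(c, world) for c in family]
            if all(v is VOID for v in values):
                return VOID
            if any(v == FALSE for v in values):
                return FALSE
            return TRUE

        # Toy world with two antecedent/consequent pairs (illustrative only).
        world = {"h1": True, "e1": True, "h2": False, "e2": False}
        c1 = (lambda w: w["e1"], lambda w: w["h1"])   # E1|H1: true in this world
        c2 = (lambda w: w["e2"], lambda w: w["h2"])   # E2|H2: void (H2 is false)
        print(quasi_conjunction([c1, c2], world))     # -> 1 (true)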

    Extropy: Complementary Dual of Entropy

    This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments of the refinement of a distribution, the axiom which concerned Shannon and Jaynes. Their duality is specified via the relationship among the entropies and extropies of coarse and fine partitions. We also analyze the extropy function for densities, showing that relative extropy constitutes a dual to the Kullback-Leibler divergence, widely recognized as the continuous entropy measure. These results are unified within the general structure of Bregman divergences. In this context they identify half the L_2 metric as the extropic dual to the entropic directed distance. We describe a statistical application to the scoring of sequential forecast distributions which provoked the discovery. Published at http://dx.doi.org/10.1214/14-STS430 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
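    As an illustration of the central definitions (a sketch, not the authors' code), the extropy of a mass function p is J(p) = -Σ_i (1 - p_i) log(1 - p_i), the complementary dual of Shannon's H(p) = -Σ_i p_i log p_i; the two coincide for a binary distribution and bifurcate for larger supports.

        import numpy as np

        def entropy(p):
            """Shannon entropy H(p) = -sum_i p_i * log(p_i)."""
            p = np.asarray(p, dtype=float)
            p = p[p > 0]                       # 0 * log 0 is taken as 0
            return -np.sum(p * np.log(p))

        def extropy(p):
            """Extropy J(p) = -sum_i (1 - p_i) * log(1 - p_i), the dual measure."""
            q = 1.0 - np.asarray(p, dtype=float)
            q = q[q > 0]
            return -np.sum(q * np.log(q))

        # For a binary distribution the two measures are identical ...
        print(entropy([0.3, 0.7]), extropy([0.3, 0.7]))
        # ... but they differ as soon as there are more than two outcomes,
        # and both are maximised by the uniform distribution.
        print(entropy([0.2, 0.3, 0.5]), extropy([0.2, 0.3, 0.5]))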

    The consumer’s demand functions defined to study contingent consumption plans. Summarized probability distributions: a mathematical application to contingent consumption choices

    Given two probability distributions expressing the returns on the two single risky assets of a portfolio, we innovatively define two consumer's demand functions associated with two contingent consumption plans. This is possible whenever every probability distribution chosen by the consumer is coherently summarized. Since prevision choices are consumption choices made by the consumer within a metric space, we show that they can be studied by means of the standard economic model of consumer behavior. Such a model implies that we consider all coherent previsions of a joint distribution; they are decomposed within a metric space that coincides with the consumer's consumption space. We do not consider a joint distribution only: we also define a stand-alone, doubly risky asset, whose different summary measures, characterizing the consumption choices made by the consumer, can then be studied within a linear space over ℝ. We show that different summary measures of probability distributions can be obtained by using two different quadratic metrics. Our results are based on a particular approach to the origin of the variability of probability distributions: this variability is not standardized, but always depends on the consumer's state of information and knowledge.
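    A minimal numerical sketch of the basic ingredient (the returns, probabilities, and the particular quadratic metric are invented for illustration, not taken from the paper): the coherent prevision of each marginal distribution is its probability-weighted mean, the pair of previsions is a point of a two-dimensional consumption space, and a quadratic (Euclidean) metric compares such points.

        import numpy as np

        def prevision(returns, probabilities):
            """Coherent prevision of a risky asset: the probability-weighted mean of its returns."""
            returns = np.asarray(returns, dtype=float)
            probabilities = np.asarray(probabilities, dtype=float)
            assert np.isclose(probabilities.sum(), 1.0), "probabilities must sum to 1"
            return float(returns @ probabilities)

        # Two marginal distributions of returns (illustrative values only).
        bundle = np.array([
            prevision([0.02, 0.05, 0.10], [0.5, 0.3, 0.2]),
            prevision([0.01, 0.04, 0.12], [0.4, 0.4, 0.2]),
        ])                                      # a point of the consumption space

        # A quadratic (Euclidean) metric between two prevision bundles.
        other_bundle = np.array([0.05, 0.04])
        print(bundle, np.linalg.norm(bundle - other_bundle))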

    Towards an epistemic theory of probability.

    The main concern of this thesis is to develop an epistemic conception of probability. In chapter one we look at Ramsey's work. In addition to his claim that the axioms of probability are laws of consistency for partial beliefs, we focus attention on his view that the reasonableness of our probability statements does not consist merely in such coherence, but is to be assessed through the vindication of the habits which give rise to them. In chapter two we examine de Finetti's account, and compare it with Ramsey's. One significant point of divergence is de Finetti's claim that coherence is the only valid form of appraisal for probability statements. His arguments for this position depend heavily on the implementation of a Bayesian model for belief change; we argue that such an approach fails to give a satisfactory account of the relation between probabilities and objective facts. In chapter three we stake out the ground for our own positive proposals: an account which is non-objective in so far as it does not require the postulation of probabilistic facts, but non-subjective in the sense that probability statements are open to objective forms of appraisal. We suggest that a certain class of probability statements are best interpreted as recommendations of partial belief, these being measurable by the betting quotients that one judges to be fair. Moreover, we argue that these probability statements are open to three main forms of appraisal (each quantifiable through the use of proper scoring rules), namely: (i) coherence, (ii) calibration, and (iii) refinement. The latter two forms of appraisal are applicable both in an ex ante sense (relative to the information known by the forecaster) and an ex post one (relative to the results of the events forecast). In chapters four and five we consider certain problems which confront theories of partial belief; in particular, (1) difficulties surrounding the justification of the rule to maximise one's information, and (2) problems with the ascription of probabilities to mathematical propositions. Both of these issues seem resolvable: the first through the principle of maximising expected utility (SEU), and the second either by amending the axioms of probability or by making use of the notion that probabilities are appraisable via scoring rules. There do remain, however, various difficulties with SEU, in particular with respect to its application in real-life situations. These are discussed, but no final conclusion is reached, except that an epistemic theory such as ours is not undermined by the inapplicability of SEU in certain situations.
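    As an illustration of appraisal via proper scoring rules (a Python sketch, not the thesis's own formulation; the data are invented), the Brier score of a sequence of probability forecasts decomposes into a reliability term, which measures calibration, and a resolution term, which measures refinement, plus the uncertainty of the outcomes.

        import numpy as np

        def brier_decomposition(forecasts, outcomes):
            """Murphy decomposition of the Brier score: BS = reliability - resolution + uncertainty,
            grouping forecasts by their distinct announced values."""
            forecasts = np.asarray(forecasts, dtype=float)
            outcomes = np.asarray(outcomes, dtype=float)
            n = len(forecasts)
            base_rate = outcomes.mean()

            reliability = resolution = 0.0
            for f in np.unique(forecasts):
                mask = forecasts == f
                n_k = mask.sum()
                obs_freq = outcomes[mask].mean()                  # observed frequency at this forecast value
                reliability += n_k * (f - obs_freq) ** 2          # miscalibration
                resolution += n_k * (obs_freq - base_rate) ** 2   # refinement (sharpness)
            reliability /= n
            resolution /= n
            uncertainty = base_rate * (1.0 - base_rate)

            brier = np.mean((forecasts - outcomes) ** 2)
            assert np.isclose(brier, reliability - resolution + uncertainty)
            return brier, reliability, resolution, uncertainty

        # Five illustrative forecasts of binary events and their outcomes.
        print(brier_decomposition([0.8, 0.8, 0.2, 0.2, 0.6], [1, 1, 0, 1, 0]))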

    Imprecise probability in epistemology

    There is a growing interest in the foundations as well as the application of imprecise probability in contemporary epistemology. This dissertation is concerned with the application. In particular, the research presented concerns ways in which imprecise probability, i.e. sets of probability measures, may helpfully address certain philosophical problems pertaining to rational belief. The issues I consider are disagreement among epistemic peers, complete ignorance, and inductive reasoning with imprecise priors. For each of these topics, it is assumed that belief can be modeled with imprecise probability, and thus a non-classical solution is to be given to each problem. I argue that this is the case for peer disagreement and complete ignorance. However, I show that the approach also has shortcomings, specifically with regard to inductive reasoning with imprecise priors. The dissertation ultimately illustrates that imprecise probability as a model of rational belief holds considerable promise, but one should also be aware of its limitations.
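    A minimal sketch of the underlying model (the three measures below are invented for the example): belief is represented by a credal set, i.e. a set of probability measures, and an event's lower and upper probabilities are the extremes assigned to it across the set.

        # A credal set over a toy two-element space, given by three measures.
        credal_set = [
            {"rain": 0.20, "no_rain": 0.80},
            {"rain": 0.40, "no_rain": 0.60},
            {"rain": 0.55, "no_rain": 0.45},
        ]

        def lower_probability(event, measures):
            """Lower probability: the smallest value the set assigns to the event."""
            return min(m[event] for m in measures)

        def upper_probability(event, measures):
            """Upper probability: the largest value the set assigns to the event."""
            return max(m[event] for m in measures)

        print(lower_probability("rain", credal_set))   # 0.2
        print(upper_probability("rain", credal_set))   # 0.55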