62 research outputs found

    ISIPTA'07: Proceedings of the Fifth International Symposium on Imprecise Probability: Theories and Applications

    Exposing some points of interest about non-exposed points of desirability

    We study the representation of sets of desirable gambles by sets of probability mass functions. Sets of desirable gambles are a very general uncertainty model that may be non-Archimedean, and therefore not representable by a set of probability mass functions. Recently, Cozman (2018) has shown that imposing the additional requirement of even convexity on sets of desirable gambles guarantees that they are representable by a set of probability mass functions. More than 20 years earlier, Seidenfeld et al. (1995) gave an axiomatisation of binary preferences—on horse lotteries, rather than on gambles—that also leads to a unique representation in terms of sets of probability mass functions. To reach this goal, they use two devices, which we will call ‘SSK–Archimedeanity’ and ‘SSK–extension’. In this paper, we make the arguments of Seidenfeld et al. (1995) explicit in the language of gambles, and show how their ideas imply even convexity and allow for conservative reasoning with evenly convex sets of desirable gambles, by deriving an equivalence between the SSK–Archimedean natural extension, the SSK–extension, and the evenly convex natural extension.
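    As a minimal sketch of the representation the abstract describes (an illustration under assumed conventions, not code from the paper): when a set of desirable gambles is represented by a set of probability mass functions, a gamble counts as desirable exactly when its expectation is positive under every pmf in the representing set. The names `expectation` and `is_desirable` are hypothetical.

    ```python
    # Illustrative sketch, not from the paper: gambles and pmfs are dicts
    # mapping outcomes to payoffs and probabilities, respectively.

    def expectation(pmf, gamble):
        """Expected payoff of a gamble under one probability mass function."""
        return sum(pmf[w] * gamble[w] for w in pmf)

    def is_desirable(pmfs, gamble):
        """A gamble is desirable iff its expectation is positive under
        every pmf in the representing set."""
        return all(expectation(p, gamble) > 0 for p in pmfs)

    # A two-element representing set on outcomes {'a', 'b'}.
    pmfs = [{'a': 0.4, 'b': 0.6}, {'a': 0.7, 'b': 0.3}]

    print(is_desirable(pmfs, {'a': 1.0, 'b': 1.0}))   # sure gain: True
    print(is_desirable(pmfs, {'a': 1.0, 'b': -1.0}))  # sign depends on the pmf: False
    ```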

    What Accuracy Could Not Be

    Two different programs are in the business of explicating accuracy—the truthlikeness program and the epistemic utility program. Both assume that truth is the goal of inquiry, and that among inquiries that fall short of realizing the goal some get closer to it than others. Truthlikeness theorists have been searching for an account of the accuracy of propositions. Epistemic utility theorists have been searching for an account of the accuracy of credal states. Both assume we can make cognitive progress in an inquiry even while falling short of the target. I show that the prospects for combining these two programs are bleak. A core accuracy principle, Proximity, that is universally embraced within the truthlikeness program turns out to be incompatible with a central principle within the epistemic utility program, namely Propriety.

    Imprecise probability in epistemology

    There is growing interest in the foundations as well as the application of imprecise probability in contemporary epistemology. This dissertation is concerned with the application. In particular, the research presented concerns ways in which imprecise probability, i.e. sets of probability measures, may helpfully address certain philosophical problems pertaining to rational belief. The issues I consider are disagreement among epistemic peers, complete ignorance, and inductive reasoning with imprecise priors. For each of these topics, it is assumed that belief can be modelled with imprecise probability, so that a non-classical solution can be given to each problem. I argue that this succeeds for peer disagreement and complete ignorance. However, the approach has its shortcomings too, specifically with regard to inductive reasoning with imprecise priors. Ultimately, the dissertation illustrates that imprecise probability as a model of rational belief holds considerable promise, provided one remains aware of its limitations.
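    To make the core notion concrete (a hedged sketch, not drawn from the dissertation): an imprecise-probability model is a set of probability measures, and the lower and upper probabilities of an event are the minimum and maximum over that set. Under complete ignorance the set is vacuous, so every non-trivial event gets lower probability 0 and upper probability 1. The names `lower_upper` and `credal_set` are illustrative.

    ```python
    # Illustrative sketch: pmfs are dicts from outcomes to probabilities,
    # events are sets of outcomes.

    def lower_upper(pmfs, event):
        """Lower and upper probability of an event over a set of pmfs."""
        probs = [sum(p[w] for w in event) for p in pmfs]
        return min(probs), max(probs)

    # Complete ignorance about a coin, modelled by a coarse grid of all
    # biases we are willing to entertain.
    credal_set = [{'heads': x / 10, 'tails': 1 - x / 10} for x in range(11)]

    print(lower_upper(credal_set, {'heads'}))  # (0.0, 1.0): vacuous bounds
    ```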

    Evenly convex sets, and evenly quasiconvex functions, revisited

    Since its appearance, even convexity has become a remarkable notion in convex analysis. In the fifties, W. Fenchel introduced evenly convex sets as those sets that are solution sets of linear systems containing strict inequalities. Later on, in the eighties, evenly quasiconvex functions were introduced as those whose sublevel sets are evenly convex. The significance of even convexity lies in the different areas where it finds applications, ranging from convex optimization to microeconomics. In this paper, we review some of the main properties of evenly convex sets and evenly quasiconvex functions, provide further characterizations of evenly convex sets, and present some new results for evenly quasiconvex functions. This research has been partially supported by MINECO of Spain and ERDF of EU, Grants PGC2018-097960-B-C22 and ECO2016-77200-P.
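    A small sketch of Fenchel's definition as stated above (an assumed illustration, not from the paper): an evenly convex set is the solution set of a system of linear inequalities, some of which may be strict. The example below tests membership in a half-open square, a set that is evenly convex but neither open nor closed; the names `satisfies` and `square` are hypothetical.

    ```python
    # Illustrative sketch: each constraint is (a, b, strict), encoding
    # a . x < b when strict is True, and a . x <= b otherwise.

    def satisfies(point, constraints):
        """Membership in the solution set of a linear system that may
        contain strict inequalities."""
        for a, b, strict in constraints:
            val = sum(ai * xi for ai, xi in zip(a, point))
            if strict and not val < b:
                return False
            if not strict and not val <= b:
                return False
        return True

    # The half-open square [0, 1) x [0, 1]: one face is cut off by a
    # strict inequality, so the set is neither open nor closed.
    square = [((1, 0), 1, True),    # x < 1
              ((-1, 0), 0, False),  # -x <= 0, i.e. x >= 0
              ((0, 1), 1, False),   # y <= 1
              ((0, -1), 0, False)]  # -y <= 0, i.e. y >= 0

    print(satisfies((0.5, 1.0), square))  # True: interior of the kept faces
    print(satisfies((1.0, 0.5), square))  # False: the face x = 1 is excluded
    ```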

    Interpreting, axiomatising and representing coherent choice functions in terms of desirability

    Choice functions constitute a simple, direct and very general mathematical framework for modelling choice under uncertainty. In particular, they are able to represent the set-valued choices that appear in imprecise-probabilistic decision making. We provide these choice functions with a clear interpretation in terms of desirability, use this interpretation to derive a set of basic coherence axioms, and show that this notion of coherence leads to a representation in terms of sets of strict preference orders. By imposing additional properties such as totality, the mixing property and Archimedeanity, we obtain representations in terms of sets of strict total orders, lexicographic probability systems, coherent lower previsions or linear previsions.

    Probabilistic Knowledge and Cognitive Ability

    Moss (2013) argues that degrees of belief, or credences, can amount to knowledge in much the way that full beliefs can. This paper explores a new kind of objective Bayesianism designed to take us some way toward securing such knowledge-constituting credences, or 'probabilistic knowledge'. Whatever else it takes for an agent’s credences to amount to knowledge, their success, or accuracy, must be the product of cognitive ability or skill. The brand of Bayesianism developed here helps ensure this ability condition is satisfied. Cognitive ability, in turn, helps make credences valuable in other ways: it helps mitigate their dependence on epistemic luck, for example. What we end up with, at the end of the day, are credences that are particularly good candidates for constituting probabilistic knowledge. What’s more, examining the character of these credences teaches us something important about what the pursuit of probabilistic knowledge demands from us. It does not demand that we give hypotheses equal treatment, by affording them equal credence. Rather, it demands that we give them equal consideration, by affording them an equal chance of being discovered.