    The relation between degrees of belief and binary beliefs: A general impossibility theorem

    Agents are often assumed to have degrees of belief (“credences”) and also binary beliefs (“beliefs simpliciter”). How are these related to each other? A much-discussed answer asserts that it is rational to believe a proposition if and only if one has a high enough degree of belief in it. But this answer runs into the “lottery paradox”: the set of believed propositions may violate the key rationality conditions of consistency and deductive closure. In earlier work, we showed that this problem generalizes: there exists no local function from degrees of belief to binary beliefs that satisfies some minimal conditions of rationality and non-triviality. “Locality” means that the binary belief in each proposition depends only on the degree of belief in that proposition, not on the degrees of belief in others. One might think that the impossibility can be avoided by dropping the assumption that binary beliefs are a function of degrees of belief. We prove that, even if we drop the “functionality” restriction, there still exists no local relation between degrees of belief and binary beliefs that satisfies some minimal conditions. Thus functionality is not the source of the impossibility; its source is the condition of locality. If there is any non-trivial relation between degrees of belief and binary beliefs at all, it must be a “holistic” one. We explore several concrete forms this “holistic” relation could take.
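
    To make the “lottery paradox” concrete, here is a minimal sketch (not from the paper) of a threshold rule for forming binary beliefs; the 10-ticket lottery and the 0.9 threshold are illustrative assumptions.

```python
# Illustrative sketch of the lottery paradox under a threshold ("Lockean") rule.
# The 10-ticket lottery and the 0.9 threshold are assumptions for illustration,
# not parameters taken from the paper.

n_tickets = 10
threshold = 0.9  # believe p iff credence(p) >= threshold

# Credences in a fair lottery with exactly one winning ticket.
credences = {f"ticket {i} loses": (n_tickets - 1) / n_tickets
             for i in range(1, n_tickets + 1)}
credences["some ticket wins"] = 1.0

# A "local" threshold rule looks only at each proposition's own credence.
beliefs = {p for p, c in credences.items() if c >= threshold}

# Every "ticket i loses" is believed, and so is "some ticket wins" ...
assert all(f"ticket {i} loses" in beliefs for i in range(1, n_tickets + 1))
assert "some ticket wins" in beliefs

# ... yet the believed propositions jointly entail that no ticket wins, so the
# belief set is inconsistent and not closed under logical consequence.
print(sorted(beliefs))
```

    The sketch only illustrates the threshold ("Lockean") special case; the paper's result concerns local rules and relations generally.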

    A Bayesian Account of Quantum Histories

    We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach describes an ordered series of measurements in terms of history propositions with non-additive 'probabilities'. The non-standard approach defines multi-time measurements as sets of exclusive and exhaustive history propositions and recovers the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely the linearly positive histories originally introduced by Goldstein and Page. Thus we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory.
    Comment: 24 pages, accepted for publication in Annals of Physics, minor corrections.
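
    For orientation, the following restates (from the standard consistent-histories literature, not verbatim from this paper) the two candidate probability assignments at issue: the usual chained-projector expression, which is generally non-additive, and the Goldstein–Page linear positivity condition.

```latex
% Notation assumed here, not quoted from the paper: C_alpha is the class
% operator of a history, a time-ordered chain of Heisenberg-picture projectors,
% and rho is the initial state.
\[
  C_\alpha \;=\; P^{(n)}_{\alpha_n}(t_n)\,\cdots\,P^{(1)}_{\alpha_1}(t_1)
\]
% Standard assignment for an ordered series of measurements; additivity over
% coarse-grainings generally fails:
\[
  p_{\mathrm{std}}(\alpha) \;=\; \mathrm{Tr}\!\left(C_\alpha\,\rho\,C_\alpha^{\dagger}\right)
\]
% Goldstein--Page linear positivity: every history in the set satisfies
\[
  p_{\mathrm{lin}}(\alpha) \;=\; \operatorname{Re}\,\mathrm{Tr}\!\left(C_\alpha\,\rho\right) \;\ge\; 0,
\]
% and p_lin is then taken as the history's probability.
```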

    Majority voting on restricted domains

    In judgment aggregation, unlike preference aggregation, not much is known about domain restrictions that guarantee consistent majority outcomes. We introduce several conditions on individual judgments sufficient for consistent majority judgments. Some are based on global orders of propositions or individuals, others on local orders, still others not on orders at all. Some generalize classic social-choice-theoretic domain conditions, others have no counterpart. Our most general condition generalizes Sen's triplewise value-restriction, itself the most general classic condition. We also prove a new characterization theorem: for a large class of domains, if there exists any aggregation function satisfying some democratic conditions, then majority voting is the unique such function. Taken together, our results support the robustness of majority rule.
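
    As a concrete illustration of how an unrestricted domain can produce inconsistent majority outcomes, here is a minimal sketch of the stock "discursive dilemma" profile from the judgment-aggregation literature; the three judges and the agenda {p, q, p∧q} are illustrative assumptions, not data from the paper.

```python
# Discursive-dilemma sketch: propositionwise majority voting on a logically
# connected agenda. The three-judge profile is a textbook example, not taken
# from the paper.

profile = {
    "judge 1": {"p": True,  "q": True,  "p and q": True},
    "judge 2": {"p": True,  "q": False, "p and q": False},
    "judge 3": {"p": False, "q": True,  "p and q": False},
}

def majority(profile, proposition):
    votes = [judgments[proposition] for judgments in profile.values()]
    return votes.count(True) > len(votes) / 2

outcome = {prop: majority(profile, prop) for prop in ("p", "q", "p and q")}
print(outcome)  # {'p': True, 'q': True, 'p and q': False}

# Each individual judgment set is consistent, but the majority accepts p and q
# while rejecting their conjunction, so the collective judgment set is not.
assert outcome["p"] and outcome["q"] and not outcome["p and q"]
```

    Domain restrictions of the kind introduced above exclude profiles like this one, which is how consistent majority outcomes are recovered.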

    Aggregation theory and the relevance of some issues to others

    I propose a general collective decision problem consisting of many issues that are interconnected in two ways: by mutual constraints and by connections of relevance. Aggregate decisions should respect the mutual constraints and be based on relevant information only. This general informational constraint has many special cases, including premise-basedness and Arrow's independence condition; they result from special notions of relevance. The existence and nature of (non-degenerate) aggregation rules depend on both types of connections. One result, if applied to the preference aggregation problem and adopting Arrow's notion of (ir)relevance, becomes Arrow's Theorem, without excluding indifferences, unlike in earlier generalisations.
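
    One schematic way to state the informational constraint described here (a reconstruction for orientation, not the paper's exact formalism): write R(p) for the set of issues deemed relevant to p, J_i for individual i's judgment set, and F for the aggregation rule; the collective judgment on p may then depend only on individual judgments on R(p).

```latex
% Schematic "independence of irrelevant information" condition; R(p), J_i and F
% are notation assumed here, not necessarily the paper's own.
\[
  \bigl(\forall i:\; J_i \cap R(p) \;=\; J'_i \cap R(p)\bigr)
  \;\Longrightarrow\;
  \bigl(p \in F(J_1,\dots,J_n) \;\Leftrightarrow\; p \in F(J'_1,\dots,J'_n)\bigr)
\]
% Taking R(p) = {p} gives an Arrow-style independence condition; taking R(p) to
% be a set of premises gives a premise-based constraint.
```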

    On the Probability of Plenitude

    I examine what the mathematical theory of random structures can teach us about the probability of Plenitude, a thesis closely related to David Lewis's modal realism. Given some natural assumptions, Plenitude is reasonably probable a priori, but in principle it can be (and plausibly it has been) empirically disconfirmed—not by any general qualitative evidence, but rather by our de re evidence.

    Deductive Cogency, understanding, and acceptance

    Deductive Cogency holds that the set of propositions towards which one has, or is prepared to have, a given type of propositional attitude should be consistent and closed under logical consequence. While there are many propositional attitudes that are not subject to this requirement, e.g. hoping and imagining, it is at least prima facie plausible that Deductive Cogency applies to the doxastic attitude involved in propositional knowledge, viz. belief. However, this thought is undermined by the well-known preface paradox, leading a number of philosophers to conclude that Deductive Cogency has at best a very limited role to play in our epistemic lives. I argue here that Deductive Cogency is still an important epistemic requirement, albeit not as a requirement on belief. Instead, building on a distinction between belief and acceptance introduced by Jonathan Cohen and recent developments in the epistemology of understanding, I propose that Deductive Cogency applies to the attitude of treating propositions as given in the context of attempting to understand a given phenomenon. I then argue that this simultaneously accounts for the plausibility of the considerations in favor of Deductive Cogency and avoids the problematic consequences of the preface paradox.

    On the Consistent Histories Approach to Quantum Mechanics

    We review the consistent histories formulations of quantum mechanics developed by Griffiths, Omnès, and Gell-Mann and Hartle, and describe the classification of consistent sets. We illustrate some general features of consistent sets by a few simple lemmas and examples. We consider various interpretations of the formalism, and examine the new problems which arise in reconstructing the past and predicting the future. It is shown that Omnès' characterisation of true statements (statements which can be deduced unconditionally in his interpretation) is incorrect. We examine critically Gell-Mann and Hartle's interpretation of the formalism, and in particular their discussions of communication, prediction and retrodiction, and conclude that their explanation of the apparent persistence of quasiclassicality relies on assumptions about an as yet unknown theory of experience. Our overall conclusion is that the consistent histories approach illustrates the need to supplement quantum mechanics by some selection principle in order to produce a fundamental theory capable of unconditional predictions.
    Comment: Published version, to appear in J. Stat. Phys. in early 1996. The main arguments and conclusions remain unaltered, but there are significant revisions from the earlier archive version. These include a new subsection on interpretations of the formalism, other additions clarifying various arguments in response to comments, and some minor corrections. (87 pages, TeX with harvmac.)
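
    To make the notion of a consistent set concrete, here is a minimal numerical sketch (my own illustration, not code from the paper) that checks the standard consistency condition Re Tr(C_α ρ C_β†) = 0 for α ≠ β on two simple two-time qubit history sets; the initial state and projector bases are assumptions chosen for illustration.

```python
# Sketch: checking the consistency (decoherence) condition for two-time qubit
# histories. The initial state and projector bases are illustrative assumptions.
import numpy as np
from itertools import product

ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)
ketp = (ket0 + ket1) / np.sqrt(2)
ketm = (ket0 - ket1) / np.sqrt(2)

def proj(v):
    return np.outer(v, v.conj())

rho = proj(ket0)                       # initial state |0><0|
Z_BASIS = [proj(ket0), proj(ket1)]     # projectors at one time, z basis
X_BASIS = [proj(ketp), proj(ketm)]     # projectors at one time, x basis

def decoherence_functional(first, second, rho):
    """D(alpha, beta) = Tr(C_alpha rho C_beta^dagger), with C = P2 @ P1."""
    chains = [P2 @ P1 for P1, P2 in product(first, second)]
    return np.array([[np.trace(Ca @ rho @ Cb.conj().T) for Cb in chains]
                     for Ca in chains])

def is_consistent(D, tol=1e-12):
    """Consistency: the real part of every off-diagonal entry vanishes."""
    off_diagonal = D.real - np.diag(np.diag(D.real))
    return bool(np.all(np.abs(off_diagonal) < tol))

# z basis at both times: the histories decohere and probabilities are additive.
print(is_consistent(decoherence_functional(Z_BASIS, Z_BASIS, rho)))  # True

# x basis then z basis: off-diagonal terms do not vanish, so this set is not
# consistent and its diagonal entries cannot serve as additive probabilities.
print(is_consistent(decoherence_functional(X_BASIS, Z_BASIS, rho)))  # False
```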