
    The Jeffreys-Lindley Paradox and Discovery Criteria in High Energy Physics

    The Jeffreys-Lindley paradox displays how the use of a p-value (or number of standard deviations z) in a frequentist hypothesis test can lead to an inference that is radically different from that of a Bayesian hypothesis test in the form advocated by Harold Jeffreys in the 1930s and common today. The setting is the test of a well-specified null hypothesis (such as the Standard Model of elementary particle physics, possibly with "nuisance parameters") versus a composite alternative (such as the Standard Model plus a new force of nature of unknown strength). The p-value, as well as the ratio of the likelihood under the null hypothesis to the maximized likelihood under the alternative, can strongly disfavor the null hypothesis, while the Bayesian posterior probability for the null hypothesis can be arbitrarily large. The academic statistics literature contains many impassioned comments on this paradox, yet there is no consensus either on its relevance to scientific communication or on its correct resolution. The paradox is quite relevant to frontier research in high energy physics. This paper is an attempt to explain the situation to both physicists and statisticians, in the hope that further progress can be made.
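    To see the divergence concretely, consider a minimal numerical sketch (not from the paper; it assumes the textbook normal-mean setup with H0: mu = 0 versus H1: mu ~ N(0, tau^2), and the conventional Bayes factor B01 = p(data | H0) / p(data | H1)). Holding the observed significance z fixed while the sample size grows keeps the p-value constant but drives the Bayes factor toward the null:

```python
import math

# A minimal numerical sketch (not from the paper): the textbook normal-mean
# setup with H0: mu = 0 versus H1: mu ~ N(0, tau^2), data mean ~ N(mu, sigma^2/n).

def lindley_demo(z, n, sigma=1.0, tau=1.0):
    """Hold the observed significance fixed at z standard errors and vary n;
    return the two-sided p-value and the Bayes factor B01 in favor of H0."""
    p_value = math.erfc(z / math.sqrt(2))   # 2 * (1 - Phi(z)); independent of n
    se2 = sigma ** 2 / n                    # variance of the sample mean
    # Under H1 the sample mean is marginally N(0, tau^2 + se2), hence:
    b01 = math.sqrt((tau ** 2 + se2) / se2) * math.exp(
        -0.5 * z ** 2 * tau ** 2 / (tau ** 2 + se2)
    )
    return p_value, b01

for n in (10, 1_000, 100_000, 10_000_000):
    p, b = lindley_demo(z=3.0, n=n)
    print(f"n={n:>10,}  p={p:.5f}  B01={b:12.3f}")
```

    The p-value sits near 0.0027 throughout, while B01 grows like sqrt(n); for large enough n a "3 sigma" result can therefore leave the posterior probability of the null close to 1, which is the paradox in miniature.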

    Multi-Agent Only-Knowing Revisited

    Levesque introduced the notion of only-knowing to precisely capture the beliefs of a knowledge base. He also showed how only-knowing can be used to formalize non-monotonic behavior within a monotonic logic. Despite its appeal, all attempts to extend only-knowing to the many agent case have undesirable properties. A belief model by Halpern and Lakemeyer, for instance, appeals to proof-theoretic constructs in the semantics and needs to axiomatize validity as part of the logic. It is also not clear how to generalize their ideas to a first-order case. In this paper, we propose a new account of multi-agent only-knowing which, for the first time, has a natural possible-world semantics for a quantified language with equality. We then provide, for the propositional fragment, a sound and complete axiomatization that faithfully lifts Levesque's proof theory to the many agent case. We also discuss comparisons to the earlier approach by Halpern and Lakemeyer.
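    For orientation, Levesque's single-agent semantics, which the paper lifts to many agents, can be stated compactly (a standard formulation, not the paper's multi-agent clauses). With the epistemic state modeled as a set e of possible worlds:

```latex
% Epistemic state e = a set of possible worlds; w the actual world.
% Knowing alpha: alpha holds at every world the agent considers possible.
e, w \models K\alpha
  \quad\text{iff}\quad
  e, w' \models \alpha \;\text{ for all } w' \in e
% Only-knowing alpha: e contains all and only the worlds compatible with alpha.
e, w \models O\alpha
  \quad\text{iff}\quad
  \bigl( w' \in e \;\Leftrightarrow\; e, w' \models \alpha \bigr) \;\text{ for all } w'
```

    So O alpha holds exactly when e contains all and only the worlds compatible with alpha: alpha is known, and nothing beyond alpha is known.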

    Communicating answer set programs

    Answer set programming is a form of declarative programming that has proven very successful in succinctly formulating and solving complex problems. Although mechanisms for representing and reasoning with the combined answer set programs of multiple agents have already been proposed, the actual gain in expressivity when adding communication has not been thoroughly studied. We show that allowing simple programs to talk to each other results in the same expressivity as adding negation-as-failure. Furthermore, we show that the ability to focus on one program in a network of simple programs results in the same expressivity as adding disjunction in the heads of rules.
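    As a rough illustration of the first result (a hedged sketch, not the paper's construction; the two-agent wiring and all names here are invented), a negation-free program can behave as if it had negation-as-failure when it is allowed to test whether another agent fails to derive an atom:

```python
# A hedged two-agent sketch (not the paper's formalism; the wiring and names
# are invented here): a negation-free program acts as if it had
# negation-as-failure by testing whether another agent FAILS to derive an atom.

def least_model(facts, rules):
    """Least model of a positive program; rules are (body_set, head) pairs."""
    model, changed = set(facts), True
    while changed:
        changed = False
        for body, head in rules:
            if body <= model and head not in model:
                model.add(head)
                changed = True
    return model

# Agent B: an ordinary positive program.
model_b = least_model(facts={"rain"}, rules=[({"rain"}, "wet")])

# Communication layer: agent A may ask whether B cannot derive "dry";
# the fresh atom "b_not_dry" plays the role of "not dry" evaluated at B.
comm_facts = {"b_not_dry"} if "dry" not in model_b else set()

# Agent A: also positive; its body atoms may be supplied by the layer above.
model_a = least_model(facts=comm_facts, rules=[({"b_not_dry"}, "take_umbrella")])
print(model_a)  # {'b_not_dry', 'take_umbrella'}
```

    The communication layer evaluates "B cannot derive dry" against B's least model, which is exactly the role "not dry" plays under negation-as-failure.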

    The e-mail game revisited - Modeling rough inductive reasoning

    I study the robustness of Rubinstein's (1989) E-Mail Game results to rough inductive reasoning. Rough induction is a form of boundedly rational reasoning where a player does not carry out every inductive step. The information structure in the E-Mail Game is generalized and the conditions are characterized under which Rubinstein's results hold. Rough induction generates a payoff-dominant equilibrium where the expected payoffs change continuously in the probability of "faulty" communication. The article follows one of Morris's (2001a) reactions to the E-Mail Game: "that one should try to come up with a model of boundedly rational behavior that delivers predictions that are insensitive to whether there is common knowledge or a large number of levels of knowledge".
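    For readers unfamiliar with the setup, a small simulation of the communication protocol in Rubinstein's E-Mail Game may help (a hedged sketch of the baseline model only; the generalized information structure and rough induction studied in the article are not modeled):

```python
import random

# A hedged sketch of the communication protocol in Rubinstein's (1989) E-Mail
# Game only; the paper's generalized information structure and rough-induction
# reasoning are not modeled here.

def email_exchange(eps, rng):
    """Machines confirm back and forth; each message is lost with probability
    eps; each player observes only how many messages their own machine sent."""
    sent = [0, 0]
    sender = 0                  # player 1's machine sends first
    while True:
        sent[sender] += 1       # this message goes out...
        if rng.random() < eps:  # ...and is lost, ending the exchange
            return sent[0], sent[1]
        sender = 1 - sender     # otherwise the other machine confirms

rng = random.Random(0)
draws = [email_exchange(eps=0.1, rng=rng) for _ in range(100_000)]
for n in (1, 5, 10):
    frac = sum(t1 >= n and t2 >= n for t1, t2 in draws) / len(draws)
    print(f"P(both machines sent >= {n} messages) ~ {frac:.4f}")
```

    The chance that both machines have sent at least n messages decays geometrically in n, which is why, under fully inductive reasoning, no finite number of confirmations produces the coordination that common knowledge would allow.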

    From probabilities to categorical beliefs: Going beyond toy models

    According to the Lockean thesis, a proposition is believed just in case it is highly probable. While this thesis enjoys strong intuitive support, it is known to conflict with seemingly plausible logical constraints on our beliefs. One way out of this conflict is to make probability 1 a requirement for belief, but most have rejected this option for entailing what they see as an untenable skepticism. Recently, two new solutions to the conflict have been proposed that are alleged to be non-skeptical. We compare these proposals with each other and with the Lockean thesis, in particular with regard to the question of how much we gain by adopting any one of them instead of the probability 1 requirement, that is, of how likely it is that one believes more than the things one is fully certain of.
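    The conflict mentioned above is easiest to see in the standard lottery paradox (a hedged illustration, not an example from the paper):

```python
# A hedged illustration via the standard lottery paradox (not an example taken
# from the paper): with a Lockean threshold below 1, every member of a belief
# set can clear the bar while the set as a whole is inconsistent.

n_tickets, threshold = 100, 0.99

pr_ticket_loses = 1 - 1 / n_tickets                 # each ticket loses with prob 0.99
believe_each_loses = pr_ticket_loses >= threshold   # believe "ticket i loses", for all i

pr_some_ticket_wins = 1.0                           # exactly one ticket wins for sure
believe_some_wins = pr_some_ticket_wins >= threshold

# Jointly: "ticket 1 loses", ..., "ticket 100 loses", "some ticket wins"
# is an inconsistent set, although each member is (Lockean-)believed.
print(believe_each_loses and believe_some_wins)     # True
```

    Raising the threshold merely calls for a larger lottery; only the probability 1 requirement escapes the pattern, which is the skeptical option the compared proposals try to avoid.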

    Is Epistemic Permissivism a Consistent Position to Argue from?


    Realism and Utopianism Revisited

    For Carr, the contrast between utopians and realists was between ‘those who regard politics as a function of ethics and those who regard ethics as a function of politics’. In other words, can we direct society in benevolent directions, perhaps to a utopia, or do we take what we are given and try to rationalize this into some form of moral acceptability? In the context of International Relations, the utopian aspires to a world without war and where power is not the primary determinant of relationships. The realist is more sceptical. Broadly, the realist stresses the constraints in life; the utopian stresses the opportunities. At this level, they are not social theories but temperamental attitudes. Writing originally in 1939, Carr regarded the realists as those who understood the significance of power in the international scene and whose voices had been neglected in the interwar years. The utopians espoused a set of disparate views prevalent at that time, linked by their neglect of power. Carr held these utopian positions to be impractical and dangerous. My aim in this article is to look at some versions of realism and some of utopianism, to see how they have developed into their modern variants. I ask how relevant these traditions, if traditions they be, are to the present world.

    Epistemic Foundation of Stable Model Semantics

    Stable model semantics has become a very popular approach for the management of negation in logic programming. This approach relies mainly on the closed world assumption to complete the available knowledge, and its formulation has its basis in the so-called Gelfond-Lifschitz transformation. The primary goal of this work is to present an epistemic characterization of stable model semantics that serves as an alternative to the Gelfond-Lifschitz transformation. In particular, we show that stable model semantics can be defined entirely as an extension of the Kripke-Kleene semantics. Indeed, we show that the closed world assumption can be seen as an additional source of 'falsehood' to be added cumulatively to the Kripke-Kleene semantics. Our approach is purely algebraic and can abstract from the particular formalism of choice, as it is based on monotone operators (under the knowledge order) over bilattices only.
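    For concreteness, the Gelfond-Lifschitz construction that the abstract refers to can be rendered as a brute-force check (an illustration of the classical definition only; the paper's algebraic, bilattice-based account is not reflected here):

```python
from itertools import chain, combinations

# A brute-force sketch of the Gelfond-Lifschitz construction named above
# (illustration only). A rule is a triple (positive_body, negative_body, head).

def least_model(rules):
    """Least model of a positive program; rules are (body_set, head) pairs."""
    model, changed = set(), True
    while changed:
        changed = False
        for body, head in rules:
            if body <= model and head not in model:
                model.add(head)
                changed = True
    return model

def stable_models(atoms, program):
    for bits in chain.from_iterable(
        combinations(sorted(atoms), r) for r in range(len(atoms) + 1)
    ):
        m = set(bits)
        # The reduct P^M: drop rules whose negative body meets M, then
        # strip the negative literals from the remaining rules.
        reduct = [(pos, head) for pos, neg, head in program if not (neg & m)]
        if least_model(reduct) == m:   # M is stable iff it equals lm(P^M)
            yield m

# p :- not q.    q :- not p.
program = [(set(), {"q"}, "p"), (set(), {"p"}, "q")]
print(list(stable_models({"p", "q"}, program)))   # [{'p'}, {'q'}]
```

    A candidate set M is stable precisely when it reproduces itself as the least model of its own reduct; the closed world assumption enters through the rules discarded because a negative body atom lies in M.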