2,952 research outputs found

    Discrete transfinite computation

    Conservation theorems for the Cohesiveness Principle

    We prove that the Cohesiveness Principle (COH) is $\Pi^1_1$ conservative over $RCA_0 + I\Sigma^0_n$ and over $RCA_0 + B\Sigma^0_n$ for all $n \geq 2$ by recursion-theoretic means. We first characterize COH over $RCA_0 + B\Sigma^0_2$ as a `jumped' version of Weak König's Lemma (WKL) and develop suitable machinery, including a version of the Friedberg jump-inversion theorem. The main theorem is obtained when we combine these with known results about WKL. In an appendix we give a proof of the $\Pi^1_1$ conservativity of WKL over $RCA_0$ by way of the Superlow Basis Theorem, and a new proof of a recent jump-inversion theorem of Towsner.
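
    For reference, the standard statement of COH in second-order arithmetic (the paper may use a notational variant) is:

        \mathrm{COH}: \quad \forall \langle R_n \rangle_{n \in \mathbb{N}} \;\exists C \, \big( C \text{ is infinite} \;\wedge\; \forall n \, ( C \subseteq^{*} R_n \;\vee\; C \subseteq^{*} \overline{R_n} ) \big)

    where $C \subseteq^{*} X$ means that all but finitely many elements of $C$ belong to $X$; such a $C$ is called cohesive for the sequence $\langle R_n \rangle$.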

    Caveats for using statistical significance tests in research assessments

    This paper raises concerns about the advantages of using statistical significance tests in research assessments, as has recently been suggested in the debate about proper normalization procedures for citation indicators. Statistical significance tests are highly controversial, and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We argue that applying statistical significance tests and mechanically adhering to their results is highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to citation indicators, interpretations of them, or the decision-making processes based upon them. On the contrary, their use may be harmful. Like many other critics, we generally believe that statistical significance tests are over- and misused in the social sciences, including scientometrics, and we encourage reform on these matters.
    Comment: Accepted version for Journal of Informetrics
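
    To illustrate the gap between statistical and substantive significance discussed above, here is a small Python sketch (hypothetical numbers, not taken from the paper; assumes numpy and scipy are available): with large enough samples, a practically negligible difference in mean citation rates comes out as highly "significant".

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Two hypothetical groups of papers whose true mean citation rates
        # differ by a practically negligible amount (5.00 vs. 5.05).
        group_a = rng.poisson(lam=5.00, size=200_000)
        group_b = rng.poisson(lam=5.05, size=200_000)

        # A two-sample t-test on very large samples flags the tiny difference
        # as highly "statistically significant" ...
        res = stats.ttest_ind(group_a, group_b)
        print(f"p-value: {res.pvalue:.2e}")

        # ... even though the effect is trivial by any substantive standard.
        print(f"difference in means: {group_b.mean() - group_a.mean():.3f} citations per paper")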

    Borderline vs. unknown: comparing three-valued representations of imperfect information

    In this paper we compare the expressive power of elementary representation formats for vague, incomplete or conflicting information. These include Boolean valuation pairs introduced by Lawry and González-Rodríguez, orthopairs of sets of variables, Boolean possibility and necessity measures, three-valued valuations, and supervaluations. We make explicit their connections with strong Kleene logic and with Belnap logic of conflicting information. The formal similarities between three-valued approaches to vagueness and formalisms that handle incomplete information often lead to a confusion between degrees of truth and degrees of uncertainty. Yet there are important differences that appear at the interpretive level: while truth-functional logics of vagueness are accepted by a part of the scientific community (even if questioned by supervaluationists), the truth-functionality assumption of three-valued calculi for handling incomplete information looks questionable, compared to the non-truth-functional approaches based on Boolean possibility–necessity pairs. This paper aims to clarify the similarities and differences between the two situations. We also study to what extent operations for comparing and merging information items in the form of orthopairs can be expressed by means of operations on valuation pairs, three-valued valuations and underlying possibility distributions.
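
    As a concrete reference point for the truth-functional calculi mentioned above, the following minimal Python sketch (illustrative only, not code from the paper) spells out the strong Kleene connectives; the same tables serve both readings of the third value, "borderline" and "unknown", which is precisely why the two get conflated.

        # Strong Kleene connectives over the truth-value order F < U < T.
        F, U, T = 0, 1, 2   # falsity, third value, truth

        def neg(a):
            return 2 - a            # swaps T and F, leaves U fixed

        def conj(a, b):
            return min(a, b)        # conjunction = minimum in the order

        def disj(a, b):
            return max(a, b)        # disjunction = maximum in the order

        # Example: "p and not p" evaluates to U (not F) when p is U -- natural
        # for a borderline case, questionable when U merely encodes ignorance.
        p = U
        print(conj(p, neg(p)))      # prints 1, i.e. U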

    The Consistency of Arithmetic

    This paper offers an elementary proof that formal arithmetic is consistent. The system that will be proved consistent is a first-order theory R♯, based as usual on the Peano postulates and the recursion equations for + and ×. However, the reasoning will apply to any axiomatizable extension of R♯ got by adding classical arithmetical truths. Moreover, it will continue to apply through a large range of variation of the underlying logic of R♯, while on a simple and straightforward translation, the classical first-order theory P♯ of Peano arithmetic turns out to be an exact subsystem of R♯. Since the reasoning is elementary, it is formalizable within R♯ itself; i.e., we can actually demonstrate within R♯ (or within P♯, if we care) a statement that, in a natural fashion, asserts the consistency of R♯ itself. The reader is unlikely to have missed the significance of the remarks just made. In plain English, this paper repeals Goedel’s famous second theorem. (That’s the one that asserts that sufficiently strong systems are inadequate to demonstrate their own consistency.) That theorem (or at least the significance usually claimed for it) was a mistake—a subtle and understandable mistake, perhaps, but a mistake nonetheless. Accordingly, this paper reinstates the formal program which is often taken to have been blasted away by Goedel’s theorems—namely, the Hilbert program of demonstrating, by methods that everybody can recognize as effective and finitary, that intuitive mathematics is reliable. Indeed, the present consistency proof for arithmetic will be recognized as correct by anyone who can count to 3. (So much, indeed, for the claim that the reliability of arithmetic rests on transfinite induction up to ε₀, and for the incredible mythology that underlies it.)
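
    For context, the theorem being contested here is Gödel's second incompleteness theorem, which in one standard formulation reads:

        If $T$ is a consistent, recursively axiomatizable theory containing enough arithmetic (e.g. extending Peano arithmetic), then $T \nvdash \mathrm{Con}(T)$, where $\mathrm{Con}(T)$ is the usual arithmetized consistency statement for $T$.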

    A Computable Economist’s Perspective on Computational Complexity

    A computable economist's view of the world of computational complexity theory is described. This means the model of computation underpinning theories of computational complexity plays a central role. The emergence of computational complexity theories from diverse traditions is emphasised. The unifications that emerged in the modern era were codified by means of the notions of efficiency of computations, non-deterministic computations, completeness, reducibility and verifiability - all three of the latter concepts had their origins in what may be called "Post's Program of Research for Higher Recursion Theory". Approximations, computations and constructions are also emphasised. The recent real model of computation as a basis for studying computational complexity in the domain of the reals is also presented and discussed, albeit critically. A brief sceptical section on algorithmic complexity theory is included in an appendix.
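
    As a reminder of how verifiability, reducibility and completeness enter the standard theory (background only, not specific to this paper), the verifier characterization of NP can be stated as:

        A language $L \subseteq \{0,1\}^{*}$ is in $\mathrm{NP}$ iff there are a polynomial $p$ and a polynomial-time verifier $V$ such that, for all $x$,
            $x \in L \iff \exists w \in \{0,1\}^{p(|x|)} \; V(x, w) = 1$,
        and $L$ is NP-complete when, in addition, every language in $\mathrm{NP}$ is polynomial-time (many-one) reducible to $L$.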