1,309 research outputs found

    Inductive Definition and Domain Theoretic Properties of Fully Abstract Models for PCF and PCF^+

    Full text link
    A construction of fully abstract typed models for PCF and PCF^+ (i.e., PCF + "parallel conditional function"), respectively, is presented. It is based on general notions of sequential computational strategies and wittingly consistent non-deterministic strategies introduced by the author in the seventies. Although these notions of strategies are old, the definition of the fully abstract models is new, in that it is given level-by-level in the finite type hierarchy. To prove full abstraction and non-dcpo domain-theoretic properties of these models, a theory of computational strategies is developed. This is also an alternative to, and in a sense an analogue of, the later game-strategy semantics approaches of Abramsky, Jagadeesan, and Malacaria; Hyland and Ong; and Nickau. In both cases (PCF and PCF^+) there are definable universal (surjective) functionals from numerical functions to any given type, which also makes each of these models unique up to isomorphism. Although such models are non-omega-complete and therefore not continuous in the traditional terminology, they are also proved to be sequentially complete (a weakened form of omega-completeness), "naturally" continuous (with respect to existing directed "pointwise", or "natural", lubs), "naturally" omega-algebraic, and "naturally" bounded complete -- appropriate generalisations of the ordinary notions of domain theory to the case of non-dcpos. Comment: 50 pages.
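    For orientation, the "parallel conditional" that separates PCF^+ from PCF is usually taken to be the continuous function pif on flat ground domains satisfying the following clauses (a standard textbook formulation, not quoted from the paper itself):
    \[
      \mathit{pif}\;\mathit{tt}\;x\;y = x, \qquad
      \mathit{pif}\;\mathit{ff}\;x\;y = y, \qquad
      \mathit{pif}\;\bot\;x\;x = x, \qquad
      \mathit{pif}\;\bot\;x\;y = \bot \ \ (x \neq y).
    \]
    The third clause forces both branches to be examined in parallel, which is why pif is not definable by any sequential strategy and why adding it strictly increases definability.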

    Minimal bounds and members of effectively closed sets

    Full text link
    We show that there exists a non-empty $\Pi^0_1$ class, with no recursive element, in which no member is a minimal cover for any Turing degree. Comment: 15 pages, 4 figures, 1 acknowledgement.
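    For reference, the two notions combined in this statement are standardly defined as follows (the usual computability-theoretic conventions are assumed):
    \[
      P \subseteq 2^{\omega} \text{ is a } \Pi^0_1 \text{ class} \iff P = [T] = \{\, X \in 2^{\omega} : \forall n\,(X \upharpoonright n \in T) \,\} \text{ for some computable tree } T \subseteq 2^{<\omega},
    \]
    \[
      \mathbf{a} \text{ is a minimal cover of } \mathbf{b} \iff \mathbf{b} < \mathbf{a} \ \wedge\ \neg\exists\,\mathbf{c}\,(\mathbf{b} < \mathbf{c} < \mathbf{a}).
    \]
    So the theorem produces a non-empty effectively closed set all of whose members compute no recursive "witness" trivially and, moreover, have degrees that are minimal covers of no degree.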

    Renormalization and Computation II: Time Cut-off and the Halting Problem

    Full text link
    This is the second installment of the project initiated in [Ma3]. In the first part, I argued that both the philosophy and the technique of perturbative renormalization in quantum field theory could be meaningfully transplanted to the theory of computation, and sketched several contexts supporting this view. In this second part, I address some of the issues raised in [Ma3] and develop them in three contexts: a categorification of algorithmic computations; time cut-off and Anytime Algorithms; and, finally, a Hopf algebra renormalization of the Halting Problem. Comment: 28 pages.
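    As a rough illustration of the time cut-off device (a generic sketch only, not the paper's Hopf-algebraic treatment): for a partial recursive f one passes to the family of cut-offs
    \[
      f_N(x) =
      \begin{cases}
        f(x) & \text{if the computation of } f(x) \text{ halts within } N \text{ steps},\\
        \text{undefined} & \text{otherwise},
      \end{cases}
    \]
    whose halting sets are decidable uniformly in N, while the halting set of f itself is recovered only in the limit N → ∞; it is the behaviour at the removal of this cut-off that invites the renormalization analogy.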

    A Direct Version of Veldman's Proof of Open Induction on Cantor Space via Delimited Control Operators

    Get PDF
    First, we reconstruct Wim Veldman's result that Open Induction on Cantor space can be derived from Double-negation Shift and Markov's Principle. In doing so, we notice that one has to use a countable choice axiom in the proof and that Markov's Principle is replaceable by a slight strengthening of the Double-negation Shift schema. We show that this strengthened version of Double-negation Shift can nonetheless be derived in a constructive intermediate logic based on delimited control operators, extended with axioms for higher-type Heyting Arithmetic. We formalize the argument and thus obtain a proof term that directly derives Open Induction on Cantor space by the shift and reset delimited control operators of Danvy and Filinski.
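    For reference, the principal schemas involved read as follows in their usual formulations (the strengthened variant of Double-negation Shift used in the paper is not reproduced here):
    \[
      \text{(DNS)}\quad \forall n\,\neg\neg A(n) \;\rightarrow\; \neg\neg\,\forall n\, A(n),
      \qquad
      \text{(MP)}\quad \neg\neg\,\exists n\,(f(n)=0) \;\rightarrow\; \exists n\,(f(n)=0),
    \]
    \[
      \text{(Open Induction on } 2^{\mathbb{N}}\text{)}\quad
      \forall \alpha\,\bigl(\forall \beta <_{\mathrm{lex}} \alpha\ (\beta \in U) \rightarrow \alpha \in U\bigr)
      \;\rightarrow\; \forall \alpha\,(\alpha \in U)
      \quad \text{for open } U \subseteq 2^{\mathbb{N}},
    \]
    where $<_{\mathrm{lex}}$ is the lexicographic order on infinite binary sequences.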

    On the mathematical and foundational significance of the uncountable

    Full text link
    We study the logical and computational properties of basic theorems of uncountable mathematics, including the Cousin and Lindelöf lemmas, published in 1895 and 1903 respectively. Historically, these lemmas were among the first formulations of open-cover compactness and the Lindelöf property, respectively. These notions are of great conceptual importance: the former is commonly viewed as a way of treating uncountable sets such as [0,1] as 'almost finite', while the latter allows one to treat uncountable sets such as ℝ as 'almost countable'. This reduction of the uncountable to the finite/countable turns out to have a considerable logical and computational cost: we show that the aforementioned lemmas, and many related theorems, are extremely hard to prove, while the associated sub-covers are extremely hard to compute. Indeed, in terms of the standard scale (based on comprehension axioms), a proof of these lemmas requires at least the full extent of second-order arithmetic, a system originating from Hilbert-Bernays' Grundlagen der Mathematik. This observation has far-reaching implications for the Grundlagen's spiritual successor, the program of Reverse Mathematics, and the associated Gödel hierarchy. We also show that the Cousin lemma is essential for the development of the gauge integral, a generalisation of the Lebesgue and improper Riemann integrals that also uniquely provides a direct formalisation of Feynman's path integral. Comment: 35 pages with one figure. The content of this version extends the published version in that Sections 3.3.4 and 3.4 below are new. Small corrections/additions have also been made to reflect new developments.
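    For concreteness, the two lemmas admit the following standard formulations (stated here only for orientation):
    \[
      \text{(Cousin)}\quad \forall\,\delta : [0,1] \to \mathbb{R}^{+}\ \ \exists\, t_1,\dots,t_k \in [0,1]\ \Bigl( [0,1] \subseteq \bigcup_{i \le k} \bigl(t_i - \delta(t_i),\; t_i + \delta(t_i)\bigr) \Bigr),
    \]
    \[
      \text{(Lindelöf)}\quad \text{every cover of } \mathbb{R} \text{ by open sets has a countable subcover}.
    \]
    A gauge δ assigns a positive radius to each point; Cousin's lemma says that finitely many of the resulting intervals, centred at their own tag points, already cover [0,1], and it is exactly such finite tagged covers that the gauge integral requires.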