
    The weakness of the pigeonhole principle under hyperarithmetical reductions

    The infinite pigeonhole principle for 2-partitions ($\mathsf{RT}^1_2$) asserts the existence, for every set $A$, of an infinite subset of $A$ or of its complement. In this paper, we study the infinite pigeonhole principle from a computability-theoretic viewpoint. We prove in particular that $\mathsf{RT}^1_2$ admits strong cone avoidance for arithmetical and hyperarithmetical reductions. We also prove the existence, for every $\Delta^0_n$ set, of an infinite low$_n$ subset of it or its complement. This answers a question of Wang. For this, we design a new notion of forcing which generalizes the first- and second-jump control of Cholak, Jockusch and Slaman.
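    Spelled out in standard notation (a paraphrase for orientation, not the paper's exact statements):

```latex
% The infinite pigeonhole principle for 2-partitions:
\mathsf{RT}^1_2:\quad \forall A \subseteq \omega\ \exists H \subseteq \omega\
  \bigl(H \text{ infinite} \wedge (H \subseteq A \vee H \subseteq \overline{A})\bigr)

% The answer to Wang's question: for every n and every \Delta^0_n set A,
\exists H\ \bigl(H \text{ infinite} \wedge (H \subseteq A \vee H \subseteq \overline{A})
        \wedge H^{(n)} \leq_T \emptyset^{(n)}\bigr)
% i.e., the infinite subset H can be chosen low_n.
```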

    Towards an arithmetic for partial computable functionals

    The thesis concerns itself with nonflat Scott information systems as an appropriate denotational semantics for the proposed theory TCF+, a constructive theory of higher-type partial computable functionals and approximations. We prove a definability theorem for type systems with at most unary constructors via atomic-coherent information systems, and give a simple proof of the density property for arbitrary finitary type systems using coherent information systems. We introduce the notions of token matrices and eigen-neighborhoods, and use them to locate normal forms of neighborhoods, as well as to demonstrate that even nonatomic information systems feature implicit atomicity. We then establish connections between coherent information systems and various pointfree structures. Finally, we introduce a fragment of TCF+ and show that extensionality can be eliminated.
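    As background, the standard definitions (not specific to this thesis): a Scott information system consists of tokens, finite consistent sets of tokens, and an entailment relation, and coherence means consistency is determined pairwise.

```latex
% A Scott information system is a triple (A, \mathrm{Con}, \vdash), with A a set
% of tokens, \mathrm{Con} a set of finite subsets of A, and
% \vdash\ \subseteq \mathrm{Con} \times A, such that:
\{a\} \in \mathrm{Con}, \qquad
U \subseteq V \in \mathrm{Con} \Rightarrow U \in \mathrm{Con}, \qquad
U \vdash a \Rightarrow U \cup \{a\} \in \mathrm{Con},
a \in U \in \mathrm{Con} \Rightarrow U \vdash a, \qquad
(U \vdash V \wedge V \vdash a) \Rightarrow U \vdash a.
% The system is coherent if consistency is determined pairwise:
U \in \mathrm{Con} \iff \forall a, b \in U.\ \{a, b\} \in \mathrm{Con}.
```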

    Axiomatizations of Compositional Inductive-Recursive Definitions

    \emph{Induction-recursion} is a definitional principle in Martin-L{\"o}f Type Theory defining families (\U , \T: \U\to D) : \Fam{D} where D: \Setone is an arbitrary fixed (large) set (which in motivating examples is chosen to be D=\Set), and \U : \Set is defined by induction while \T is simultaneously defined by recursion on \U; the qualifier ``simultaneously'' means here that \U may depend on values of the function \T : \U \to D. Two equivalent\footnote{These axiomatizations are equivalent if the underlying logical framework is chosen appropriately.} axiomatizations of this situation were proposed by Dybjer-Setzer in \cite{dybjer99}\cite{dybjer03}. In both cases a (large) set of \emph{codes} \DS\; D\; D (respectively \DS') for inductive-recursive definitions is defined such that each code c : \DS\; D\; D decodes to an endofunctor \sem{c} : \Fam{D}\to \Fam{D} on the category of families, whose initial algebra is the family defined by this code c. The authors proved the consistency of their axiomatizations by giving a set-theoretic model. These axiomatizations \DS and \DS' are, however, not the only reasonable axiomatizations of induction-recursion. There are at least two ways to come to this conclusion: the more practical one is motivated by the observation that while in the reference theory of inductive definitions it is always possible to compose (in a semantically sound way) two inductive definitions into a single new one, this is hardly the case for the preexisting axiomatizations of induction-recursion. The second, more conceptual, observation about the existing axiomatizations of induction-recursion is that they do not contain constructors for dependent products (or powers\footnote{There is a close relationship between dependent products and powers of codes.}) of codes but only for dependent sums of codes.
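    For orientation, the Dybjer-Setzer codes can be sketched as follows, in the abstract's own notation (a standard presentation; the thesis's formulation may differ in details):

```latex
% Constructors of the set of codes \DS\; D\; D (sketch):
\iota  : D \to \DS\; D\; D
       % no further arguments; decodes to a constant functor
\sigma : (A : \Set) \to (A \to \DS\; D\; D) \to \DS\; D\; D
       % a non-inductive argument a : A; the rest of the code may depend on a
\delta : (A : \Set) \to ((A \to D) \to \DS\; D\; D) \to \DS\; D\; D
       % an A-indexed family of inductive arguments; the rest of the code may
       % depend on their \T-values
% Each code c decodes to an endofunctor \sem{c} : \Fam{D} \to \Fam{D}, and the
% family (\U, \T) it defines is the initial algebra of \sem{c}.
```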
Indeed, we show that these two observations are related by characterizing compositionality of Dybjer-Setzer induction-recursion in terms of the existence of powers of codes by sets. Starting from this characterization, we define and explore two new axiomatizations of induction-recursion satisfying the mentioned characterization and for which we prove compositionality. In the first one, this is achieved by restricting to a subsystem\footnote{By ``subsystem'' we mean here that there is a semantics-preserving translation from \UF to \DS.} \UF of \DS of \emph{uniform codes} for which powers of codes exist. Consistency of this system is established by a semantics-preserving embedding into the system \DS. The second axiomatization \IR of \emph{polynomial codes} (so called since they are based on the idea of iterating polynomials\footnote{Polynomials are also called \emph{containers} and are a formalization of inductive definitions.}) defines a system into which \DS can be embedded and which contains a constructor for dependent products (and in particular powers) of codes. While for \UF the existence of a model can be obtained by embedding it into \DS, for which Dybjer-Setzer themselves devised a (set-theoretic) model, we cannot argue in this way for a model of \IR; instead we provide a new model for the latter with almost the same set-theoretic assumptions concerning large cardinals: while Dybjer-Setzer induction-recursion can be modeled in ZFC supplemented by a Mahlo cardinal and a 0-inaccessible, we need ZFC plus a Mahlo cardinal and a 1-inaccessible.
Since the system \IR does not simply arise by adding a constructor for dependent products of codes to \DS ceteris paribus, but additionally requires redefining all other constructors, the question arises which constructors generate the image of the inclusion \DS\hookrightarrow \IR; we approach this question by defining an intermediary system lying between \DS and \IR and give a translation of this intermediary system into \DS. This intermediary system does not arise merely by removing the constructor for dependent products of codes, since this alone did not enable us to define the desired translation to \DS; it additionally introduces a uniformity constraint, realized by annotating codes with binary trees. A common feature of both new systems \UF and \IR is that they admit a more flexible relation between codes and their subcodes than is the case for \DS: the sub-\DS-codes of a given \DS-code all have the same type, while sub-\UF- and sub-\IR-codes are not constrained in this way. This latter feature reveals a more abstract way of arriving at the conclusion that Dybjer-Setzer induction-recursion is not the most general formulation of induction-recursion: just as inductive definitions can be characterized by a set of operations on sets\footnote{These operations are called ``strictly positive operations'' (see \cite{dybjer1996Wtypes}).}, inductive-recursive definitions are also determined by a set of operations on families (namely those represented by the set of functors defined by codes), and Dybjer-Setzer's systems did not take into account operations that change the (large) set D indexing families, while the new systems \UF and \IR do so.
In line with the idea of considering induction-recursion as contributing to the pursuit of the project of formalizing the metatheory of type theory in type theory itself \cite[p.1]{Dybjer96internaltype}, we return to Dybjer-Setzer's original formalization \DS and provide a \emph{relationally parametric} model for it, which we articulate as a categories-with-families model in the category of reflexive graphs; categories-with-families were proposed in loc.cit. as a formalization of type theory inside type theory that is additionally well adapted to category-theoretic reasoning. Relational parametricity is an established and important technique for proving meta-theoretic properties of type theories.

    A Study of Syntactic and Semantic Artifacts and its Application to Lambda Definability, Strong Normalization, and Weak Normalization in the Presence of...

    Church's lambda-calculus underlies the syntax (i.e., the form) and the semantics (i.e., the meaning) of functional programs. This thesis is dedicated to studying man-made constructs (i.e., artifacts) in the lambda-calculus. For example, one puts the expressive power of the lambda-calculus to the test in the area of lambda definability. In this area, we present a course-of-value representation bridging Church numerals and Scott numerals. We then turn to weak and strong normalization using Danvy et al.'s syntactic and functional correspondences. We give a new account of Felleisen and Hieb's syntactic theory of state, and of abstract machines for strong normalization due to Curien, Crégut, Lescanne, and Kluge.
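    To make the two numeral systems concrete, here is a small Python sketch of the standard Church and Scott encodings (an illustration only, not code from the thesis):

```python
# Church numeral: n is the n-fold iterator, n = \f.\x. f(f(...(f x))).
zero_c = lambda f: lambda x: x
succ_c = lambda n: lambda f: lambda x: f(n(f)(x))
to_int_c = lambda n: n(lambda k: k + 1)(0)

# Scott numeral: n is its own case analysis, so predecessor is constant time.
#   zero = \s.\z. z        succ n = \s.\z. s n
zero_s = lambda s: lambda z: z
succ_s = lambda n: lambda s: lambda z: s(n)
pred_s = lambda n: n(lambda m: m)(zero_s)  # case: succ m -> m, zero -> zero

def to_int_s(n):
    # Peel off successors one case distinction at a time.
    return n(lambda m: 1 + to_int_s(m))(0)

print(to_int_c(succ_c(succ_c(zero_c))))          # 2
print(to_int_s(pred_s(succ_s(succ_s(zero_s)))))  # 1
```

    Church numerals make iteration primitive but the predecessor awkward; Scott numerals make case distinction and the predecessor primitive but provide no iteration. A course-of-value representation, as studied in the thesis, bridges the two by giving each numeral access to all of its predecessors.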

    POPLMark reloaded: Mechanizing proofs by logical relations

    We propose a new collection of benchmark problems in mechanizing the metatheory of programming languages, in order to compare and push the state of the art of proof assistants. In particular, we focus on proofs using logical relations (LRs) and propose establishing strong normalization of a simply typed calculus with a proof by Kripke-style LRs as a benchmark. We give a modern view of this well-understood problem by formulating our LR on well-typed terms. Using this case study, we share some of the lessons learned tackling this problem in different dependently typed proof environments. In particular, we consider the mechanization in Beluga, a proof environment that supports higher-order abstract syntax encodings, and contrast it to the development and strategies used in general-purpose proof assistants such as Coq and Agda. The goal of this paper is to engage the community in discussions on what support in proof environments is needed to truly bring mechanized metatheory to the masses, and to engage said community in the crafting of future benchmarks.
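    As a rough reminder of the technique (the standard unary logical relation for strong normalization; the paper's Kripke-style version additionally indexes the relation by typing contexts and context extensions):

```latex
% Reducibility predicate R_T on terms, defined by induction on the type T:
\mathcal{R}_{b}(t) \iff t \text{ is strongly normalizing} \qquad (b \text{ a base type})
\mathcal{R}_{T \to S}(t) \iff \forall u.\ \mathcal{R}_T(u) \Rightarrow \mathcal{R}_S(t\, u)
% Fundamental lemma: every well-typed term is in the relation at its type;
% strong normalization of the calculus follows. In the Kripke-style variant
% one quantifies also over context extensions (worlds), making the relation
% stable under weakening.
```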

    Out of Nowhere: The emergence of spacetime from causal sets

    This is a chapter of the planned monograph "Out of Nowhere: The Emergence of Spacetime in Quantum Theories of Gravity", co-authored by Nick Huggett and Christian Wüthrich and under contract with Oxford University Press. (More information at www.beyondspacetime.net.) This chapter sketches how spacetime emerges in causal set theory and demonstrates how this question is deeply entangled with genuinely philosophical concerns.
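    For context, the standard definition underlying the chapter's subject (general background, not specific to this chapter): a causal set is a locally finite partial order.

```latex
% A causal set is a pair (C, \preceq) where \preceq is a partial order
% (reflexive, antisymmetric, transitive) that is locally finite:
\forall x, y \in C:\quad |\{\, z \in C : x \preceq z \preceq y \,\}| < \infty.
% The order relation encodes causal structure, while counting elements supplies
% volume information; hence the causal-set slogan ``order + number = geometry''.
```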

    Perturbation theory of polynomials and linear operators

    This survey revolves around the question of how the roots of a monic polynomial (resp. the spectral decomposition of a linear operator), whose coefficients depend in a smooth way on parameters, depend on those parameters. The parameter dependence of the polynomials (resp. operators) ranges from real analytic over $C^\infty$ to differentiable of finite order, with often drastically different regularity results for the roots (resp. eigenvalues and eigenvectors). Another interesting point is the difference between the perturbation theory of hyperbolic polynomials (where, by definition, all roots are real) and that of general complex polynomials. The subject, which started with Rellich's work in the 1930s, has enjoyed sustained interest, intensifying in the last two decades and bringing some definitive optimal results. Throughout we try to explain the main proof ideas; Rellich's theorem and Bronshtein's theorem on hyperbolic polynomials are presented with full proofs. The survey is written for readers interested in singularity theory but also for those who intend to apply the results in other fields.
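    A classical pair of examples illustrating the phenomenon (standard textbook material, not taken from the survey): smooth coefficients need not give smooth roots, and hyperbolicity changes the picture.

```latex
% Non-hyperbolic: P_t(x) = x^2 - t has roots \pm\sqrt{t} for t \ge 0; the
% coefficients are real analytic in t, yet the roots are not Lipschitz at t = 0.
% Hyperbolic: P_t(x) = x^2 - t^2 has all roots real for every t; ordering the
% roots by size gives |t| and -|t|, which are not differentiable at 0, but the
% real analytic parametrization t \mapsto (t, -t) exists, as guaranteed by
% Rellich's theorem for analytic curves of hyperbolic polynomials.
```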