92 research outputs found

    From Nonstandard Analysis to various flavours of Computability Theory

    As suggested by the title, it has recently become clear that theorems of Nonstandard Analysis (NSA) give rise to theorems in computability theory (no longer involving NSA). Computability theory divides into classical and higher-order computability theory, where the former sub-discipline deals with objects of type zero and one, and the latter with objects of all types. The known results of this kind for NSA deal exclusively with the higher-order case; in this paper we show that theorems of NSA also give rise to theorems in classical computability theory by considering so-called textbook proofs.
    Comment: To appear in the proceedings of TAMC2017 (http://tamc2017.unibe.ch/)
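
    As a rough illustration of the mechanism behind such results (a sketch of the 'term extraction' normal form used in this line of work, stated informally rather than as one of the paper's theorems): if the nonstandard system proves a sentence of the form

        (\forall^{\mathrm{st}} x)(\exists^{\mathrm{st}} y)\,\varphi(x,y), with \varphi internal,

    then a term t can be extracted from that proof such that the base theory, no longer involving NSA, proves

        (\forall x)(\exists y \in t(x))\,\varphi(x,y).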

    Grilliot's trick in Nonstandard Analysis

    The technique known as Grilliot's trick constitutes a template for explicitly defining the Turing jump functional $(\exists^2)$ in terms of a given effectively discontinuous type-two functional. In this paper, we discuss the standard extensionality trick: a technique similar to Grilliot's trick in Nonstandard Analysis. This nonstandard trick derives, from the existence of certain nonstandard discontinuous functionals, the Transfer principle of Nonstandard Analysis limited to $\Pi_1^0$-formulas; from this (generally ineffective) implication, we obtain an effective implication expressing the Turing jump functional in terms of a discontinuous functional (and no longer involving Nonstandard Analysis). The advantage of our nonstandard approach is that one obtains effective content without paying attention to effective content. We also discuss a new class of functionals which all seem to fall outside the established categories; these functionals derive directly from the Standard Part axiom of Nonstandard Analysis.
    Comment: 21 pages
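
    For orientation, the two principles named above are commonly written as follows (a standard formulation from this literature; the paper's precise variants may differ):

        (\exists^2): \ (\exists \varphi^2)(\forall f^1)\bigl[\varphi(f)=0 \leftrightarrow (\exists n^0)(f(n)=0)\bigr]

        \Pi_1^0\text{-TRANS}: \ (\forall^{\mathrm{st}} f^1)\bigl[(\forall^{\mathrm{st}} n)(f(n)\neq 0) \rightarrow (\forall n)(f(n)\neq 0)\bigr]

    Grilliot's trick produces the former from a discontinuous functional directly; the nonstandard trick discussed in the paper instead passes through the latter.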

    An algorithmic approach to the existence of ideal objects in commutative algebra

    The existence of ideal objects, such as maximal ideals in nonzero rings, plays a crucial role in commutative algebra. These are typically justified using Zorn's lemma, and thus pose a challenge from a computational point of view. Giving a constructive meaning to ideal objects is a problem that dates back to Hilbert's program, and today it is still a central theme in the area of dynamical algebra, which focuses on the elimination of ideal objects via syntactic methods. In this paper, we take an alternative approach based on Kreisel's no-counterexample interpretation and sequential algorithms. We first give a computational interpretation to an abstract maximality principle in the countable setting via an intuitive, state-based algorithm. We then carry out a concrete case study, in which we give an algorithmic account of the result that in any commutative ring, the intersection of all prime ideals is contained in its nilradical.
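
    The paper's interpretation is proof-theoretic, but the target statement can be sanity-checked directly in small finite rings. The sketch below is an illustrative brute-force check over Z/nZ (not the state-based algorithm developed in the paper); it relies on the fact that the prime ideals of Z/nZ are exactly the ideals (p) for the prime divisors p of n.

        # Illustrative Python check of "the intersection of all prime ideals
        # is contained in the nilradical" for the finite rings Z/nZ.

        def prime_divisors(n):
            """Prime divisors of n, by trial division."""
            ps, d = [], 2
            while d * d <= n:
                if n % d == 0:
                    ps.append(d)
                    while n % d == 0:
                        n //= d
                d += 1
            if n > 1:
                ps.append(n)
            return ps

        def is_nilpotent(x, n):
            """True if x^k = 0 in Z/nZ for some k."""
            y = x % n
            for _ in range(n):
                if y == 0:
                    return True
                y = (y * x) % n
            return y == 0

        def intersection_of_primes_in_nilradical(n):
            ps = prime_divisors(n)
            for x in range(n):
                in_every_prime_ideal = all(x % p == 0 for p in ps)
                if in_every_prime_ideal and not is_nilpotent(x, n):
                    return False
            return True

        print(all(intersection_of_primes_in_nilradical(n) for n in range(2, 200)))  # expected: True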

    On the mathematical and foundational significance of the uncountable

    We study the logical and computational properties of basic theorems of uncountable mathematics, including the Cousin and Lindel\"of lemmas, published in 1895 and 1903 respectively. Historically, these lemmas were among the first formulations of open-cover compactness and the Lindel\"of property, respectively. These notions are of great conceptual importance: the former is commonly viewed as a way of treating uncountable sets such as $[0,1]$ as 'almost finite', while the latter allows one to treat uncountable sets such as $\mathbb{R}$ as 'almost countable'. This reduction of the uncountable to the finite/countable turns out to have a considerable logical and computational cost: we show that the aforementioned lemmas, and many related theorems, are extremely hard to prove, while the associated sub-covers are extremely hard to compute. Indeed, in terms of the standard scale (based on comprehension axioms), a proof of these lemmas requires at least the full extent of second-order arithmetic, a system originating from Hilbert-Bernays' Grundlagen der Mathematik. This observation has far-reaching implications for the Grundlagen's spiritual successor, the program of Reverse Mathematics, and the associated G\"odel hierarchy. We also show that the Cousin lemma is essential for the development of the gauge integral, a generalisation of the Lebesgue and improper Riemann integrals that also uniquely provides a direct formalisation of Feynman's path integral.
    Comment: 35 pages with one figure. The content of this version extends the published version in that Sections 3.3.4 and 3.4 below are new. Small corrections/additions have also been made to reflect new developments.
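
    For reference, the Cousin lemma discussed above can be stated as follows (one standard formulation; the paper works with an equivalent version phrased in terms of uncountable open covers):

        For every \delta:[0,1]\to\mathbb{R}^{+} there are x_1,\dots,x_k\in[0,1] such that the intervals (x_i-\delta(x_i),\,x_i+\delta(x_i)) cover [0,1].

    In the gauge integral, Riemann sums are taken over \delta-fine tagged partitions, and the Cousin lemma is what guarantees that such partitions exist at all.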