Hilbert's Metamathematical Problems and Their Solutions
This dissertation examines several of the problems that Hilbert discovered in the foundations of mathematics, from a metalogical perspective. The problems manifest themselves in four different aspects of Hilbert's views: (i) Hilbert's axiomatic approach to the foundations of mathematics; (ii) his response to criticisms of set theory; (iii) his response to intuitionist criticisms of classical mathematics; (iv) Hilbert's contribution to the specification of the role of logical inference in mathematical reasoning. This dissertation argues that Hilbert's axiomatic approach was guided primarily by model-theoretical concerns. Accordingly, the ultimate aim of his consistency program was to prove the model-theoretical consistency of mathematical theories. It turns out that for the purpose of carrying out such consistency proofs, a suitable modification of ordinary first-order logic is needed; independence-friendly (IF) logic provides the appropriate conceptual framework for this modification. It is then shown how the model-theoretical consistency of arithmetic can be proved by using IF logic as its basic logic.
Hilbert's other problems, manifesting themselves as aspects (ii), (iii), and (iv) (most notably the problem of the status of the axiom of choice, the problem of the role of the law of excluded middle, and the problem of giving an elementary account of quantification), can likewise be approached by using the resources of IF logic. It is shown that by means of IF logic one can carry out Hilbertian solutions to all of these problems. The two major results concerning aspects (ii), (iii), and (iv) are the following: (a) the axiom of choice is a logical principle; (b) the law of excluded middle divides metamathematical methods into elementary and non-elementary ones. It is argued that these results show that IF logic helps to vindicate Hilbert's nominalist philosophy of mathematics. On the basis of an elementary approach to logic, which enriches the expressive resources of ordinary first-order logic, this dissertation shows how the different problems that Hilbert discovered in the foundations of mathematics can be solved.
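The connection behind result (a) can be sketched in a formula (my own formulation, following the standard Skolemization reading of IF quantifiers; it is not taken from the dissertation itself):

```latex
% An ordinary \forall\exists sentence is true exactly when it has a
% Skolem function witnessing the existential quantifier:
\forall x\, \exists y\, \varphi(x, y)
  \;\Longleftrightarrow\;
  \exists f\, \forall x\, \varphi(x, f(x))
% In IF logic, quantifier (in)dependence is part of the logical syntax,
% e.g. (\exists y / \forall x) marks y as chosen independently of x.
% Treating the equivalence above as a logical truth is an instance of
% the axiom of choice, which is why, from the IF perspective, choice
% behaves as a principle of logic rather than of set theory.
```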
Computability in constructive type theory
We give a formalised and machine-checked account of computability theory in the Calculus of Inductive Constructions (CIC), the constructive type theory underlying the Coq proof assistant. We first develop synthetic computability theory, pioneered by Richman, Bridges, and Bauer, where one treats all functions as computable, eliminating the need for a model of computation. We assume a novel parametric axiom for synthetic computability and give proofs of results like Rice's theorem, the Myhill isomorphism theorem, and the existence of Post's simple and hypersimple predicates, relying on no other axioms such as Markov's principle or choice axioms. As a second step, we introduce models of computation. We give a concise overview of definitions of various standard models and contribute machine-checked simulation proofs, which pose a non-trivial engineering effort. We identify a notion of synthetic undecidability relative to a fixed halting problem, allowing axiom-free machine-checked proofs of undecidability. We contribute such undecidability proofs for the historical foundational problems of computability theory, which require the identification of invariants left out in the literature and now form the basis of the Coq Library of Undecidability Proofs. We then identify the weak call-by-value λ-calculus L as a sweet spot for programming in a model of computation. We introduce a certifying extraction framework and analyse an axiom stating that every function of type ℕ → ℕ is L-computable.
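The synthetic viewpoint described above can be illustrated in a few lines (a minimal sketch in Lean rather than Coq, with my own definition names; the dissertation's actual development is far richer):

```lean
-- Synthetic decidability: a predicate is decidable when some function
-- into Bool agrees with it -- no machine model is mentioned, because
-- in the synthetic setting all definable functions count as computable.
def SynDecidable (p : Nat → Prop) : Prop :=
  ∃ f : Nat → Bool, ∀ n, p n ↔ f n = true

-- Many-one reduction between predicates, again purely functional.
def Reduces (p q : Nat → Prop) : Prop :=
  ∃ r : Nat → Nat, ∀ n, p n ↔ q (r n)

-- Decidability transports backwards along a reduction; contrapositively,
-- if p is undecidable and p reduces to q, then q is undecidable. This is
-- the axiom-free mechanism behind undecidability proofs relative to a
-- fixed halting problem.
theorem dec_transport {p q : Nat → Prop}
    (hred : Reduces p q) (hq : SynDecidable q) : SynDecidable p :=
  match hred, hq with
  | ⟨r, hr⟩, ⟨f, hf⟩ =>
    ⟨fun n => f (r n), fun n => (hr n).trans (hf (r n))⟩
```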
Infinity
This essay surveys the different types of infinity that occur in pure and applied mathematics, with emphasis on: 1. the contrast between potential infinity and actual infinity; 2. Cantor's distinction between transfinite sets and absolute infinity; 3. the constructivist view of infinite quantifiers and the meaning of constructive proof; 4. the concept of feasibility and the philosophical problems surrounding feasible arithmetic; 5. Zeno's paradoxes and modern paradoxes of physical infinity involving supertasks.
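The arithmetic underlying Zeno's dichotomy paradox can be made concrete (a minimal numeric sketch of my own, not taken from the essay): the partial sums of 1/2 + 1/4 + 1/8 + ... approach 1, which is why a supertask of infinitely many ever-shorter steps can cover a finite distance.

```python
# Zeno's dichotomy: the runner covers 1/2, then 1/4, then 1/8, ...
# The n-th partial sum equals 1 - (1/2)**n, so the sums approach 1
# from below without ever exceeding it.
def partial_sum(n: int) -> float:
    """Sum of the first n terms of the geometric series 1/2 + 1/4 + ..."""
    return sum(0.5 ** k for k in range(1, n + 1))

for n in (1, 2, 10, 50):
    print(n, partial_sum(n))
```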
Mathematical Logic: Proof Theory, Constructive Mathematics
[no abstract available]
Inductive Theorem Proving Using Refined Unfailing Completion Techniques
We present a brief overview of completion-based inductive theorem proving techniques, point out the key concepts of the underlying "proof by consistency" paradigm, and isolate an abstract description of what is necessary for an algorithmic realization of such methods.
In particular, we give several versions of proof orderings which, under certain conditions, are well suited for that purpose. Together with corresponding notions of (positive and negative) covering sets, we get abstract "positive" and "negative" characterizations of inductive validity. As a consequence, we can generalize known criteria for inductive validity, even for cases where some of the conjectures may not be orientable or where the base system is terminating but not necessarily ground confluent.
Furthermore, we consider several refinements and optimizations of completion-based inductive theorem proving techniques. In particular, sufficient criteria for being a covering set, including restrictions of critical pairs (and the usage of non-equational inductive knowledge), are discussed.
Moreover, a couple of lemma generation methods are briefly summarized and classified. A new technique of safe generalization is particularly interesting, since it provides means for syntactic generalizations, i.e. simplifications, of conjectures without losing semantic equivalence.
Finally, we present the main features and characteristics of UNICOM, an inductive theorem prover with refined unfailing completion techniques, built on top of TRSPEC, a term-rewriting-based system for investigating algebraic specifications.
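The term rewriting setting this abstract works in can be illustrated with a toy example (a hedged sketch of my own, not UNICOM or TRSPEC: it normalizes ground Peano terms with the usual addition rules and tests an equational conjecture on ground instances, the kind of check that "proof by consistency" lifts to all instances at once):

```python
# Minimal ground term rewriting over Peano naturals. Terms are nested
# tuples: ("0",) is zero, ("s", t) is successor, ("+", a, b) is addition.
# Rewrite rules (oriented left-to-right; terminating and ground confluent):
#   0 + y    -> y
#   s(x) + y -> s(x + y)

ZERO = ("0",)

def s(t):
    return ("s", t)

def plus(a, b):
    return ("+", a, b)

def num(n):
    """Embed a Python int as a Peano term."""
    return ZERO if n == 0 else s(num(n - 1))

def rewrite(t):
    """Normalize a ground term by innermost rewriting with the two rules."""
    if t[0] == "s":
        return s(rewrite(t[1]))
    if t[0] == "+":
        a, b = rewrite(t[1]), rewrite(t[2])
        if a == ZERO:        # rule: 0 + y -> y
            return b
        if a[0] == "s":      # rule: s(x) + y -> s(x + y)
            return s(rewrite(plus(a[1], b)))
    return t

def holds_on_ground_instances(conjecture, bound=6):
    """Check an equational conjecture on all ground instances up to a bound.
    A failing instance would witness inconsistency of the conjecture with
    the base system; proof by consistency certifies all instances at once."""
    return all(
        rewrite(conjecture(num(i), num(j))[0])
        == rewrite(conjecture(num(i), num(j))[1])
        for i in range(bound) for j in range(bound)
    )

# Conjecture: commutativity of addition, x + y = y + x.
comm = lambda x, y: (plus(x, y), plus(y, x))
print(holds_on_ground_instances(comm))
```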
Paul Lorenzen -- Mathematician and Logician
This open access book examines the many contributions of Paul Lorenzen, an outstanding philosopher of the latter half of the 20th century. It features papers focused on integrating Lorenzen's original approach into the history of logic and mathematics. The papers also explore how practitioners can implement Lorenzen's systematical ideas in today's debates on proof-theoretic semantics, databank management, and stochastics. Coverage details key contributions of Lorenzen to constructive mathematics, Lorenzen's work on lattice-groups and divisibility theory, and modern set theory and Lorenzen's critique of actual infinity. The contributors also look at the main problem of Grundlagenforschung, Lorenzen's consistency proof, and Hilbert's larger program. In addition, the papers offer a constructive examination of a Russell-style Ramified Type Theory and a way out of the circularity puzzle within the operative justification of logic and mathematics. Paul Lorenzen's name is associated with the Erlangen School of Methodical Constructivism, whose approach to linguistic philosophy and philosophy of science shaped philosophical discussions in Germany, especially in the 1960s and 1970s. This volume features 10 papers from a meeting that took place at the University of Konstanz.
Mini-Workshop: Fine Structure Theory and Inner Models
The main aim of fine structure theory and inner model theory can be summarized as the construction of models which have a canonical inner structure (a fine structure), making it possible to analyze them in great detail, and which at the same time reflect important aspects of the surrounding mathematical universe, in that they satisfy certain strong axioms of infinity or contain complicated sets of reals. Applications range from obtaining lower bounds on the consistency strength of all sorts of set-theoretic principles in terms of large cardinals, to proving the consistency of certain combinatorial properties, their compatibility with strong axioms of infinity, or outright proving results in descriptive set theory (for which no proofs avoiding fine structure and inner models are in sight). Fine structure theory and inner model theory have become a sophisticated and powerful apparatus which yields results that are among the deepest in set theory.