A Possible and Necessary Consistency Proof
After Gödel's incompleteness theorems and the collapse of Hilbert's programme, Gerhard Gentzen continued the quest for consistency proofs of Peano arithmetic. He considered a finitistic or constructive proof still possible and necessary for the foundations of mathematics. For a proof to be meaningful, the principles relied on should be considered more reliable than the doubtful elements of the theory concerned. He worked out a total of four proofs between 1934 and 1939.
This thesis examines Gentzen's consistency proofs for arithmetic from different angles. The consistency of Heyting arithmetic is shown both in a sequent calculus notation and in natural deduction. The former proof includes a cut elimination theorem for the calculus and a syntactical study of the purely arithmetical part of the system. The latter consistency proof, in standard natural deduction, had remained an open problem since the publication of Gentzen's proofs. The solution to this problem for an intuitionistic calculus is based on a normalization proof by Howard. The proof is performed in the manner of Gentzen, by giving a reduction procedure for derivations of falsity. In contrast to Gentzen's proof, the procedure contains a vector assignment. Each reduction step decreases the first component of the vector, and this component can be interpreted as an ordinal less than epsilon_0, thus ordering the derivations by complexity and proving termination of the process.
The limitations of formal systems uncovered by Gödel's incompleteness theorems in 1931 imply that the consistency of Peano arithmetic can only be proved by means of fundamental principles that cannot be formalized within the system itself. Although Hilbert's finitistic methods could not produce a consistency proof, the search for a proof by constructive methods continued. For a proof to be meaningful, the principles employed should be more reliable than the elements of the theory that are in doubt.
The title of the thesis refers to a quotation from Gentzen in which he motivates the need for consistency proofs of first-order arithmetic. Gentzen himself produced four consistency proofs and analysed how well they conformed to Hilbert's programme. Gentzen used constructive methods in his proofs, but it is debated whether these methods can be regarded as finitistic.
The third and best-known proof presents a reduction of derivations of contradictions. By transfinite induction it is shown that the reduction process terminates in a simple derivation that can be seen to be impossible. Hence there can be no derivation of a contradiction.
The thesis examines and compares Gentzen's proofs from different angles. The consistency of intuitionistic Heyting arithmetic is proved both in sequent calculus and in natural deduction. The former proof is in Gentzen's spirit and contains a cut elimination proof for the calculus and a syntactical study of the arithmetical part of the system. The latter proof resembles a normalization proof and shows termination by means of a vector assignment.
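The ordering of derivations by ordinals below epsilon_0, mentioned above, can be made concrete: such ordinals admit a finite Cantor normal form representation, and the comparison that drives the termination argument is lexicographic. The representation and function below are a minimal illustrative sketch only, not the vector assignment used in the thesis.

```python
# Ordinals below epsilon_0 in Cantor normal form.
# An ordinal is a non-increasing tuple of exponents, each exponent
# itself an ordinal in the same representation; () represents 0.
# Thus omega^0 = 1 is ((),) and omega^1 is (((),),).

def less_than(a, b):
    """Lexicographic comparison of Cantor normal forms."""
    # Compare exponents term by term; a proper prefix is smaller.
    for x, y in zip(a, b):
        if less_than(x, y):
            return True
        if less_than(y, x):
            return False
    return len(a) < len(b)

ZERO = ()
ONE = (ZERO,)        # omega^0
OMEGA = (ONE,)       # omega^1
OMEGA_PLUS_ONE = (ONE, ZERO)
```

A reduction procedure that strictly decreases such an ordinal at every step must terminate, since the ordering is well-founded; this is the role the first vector component plays in the proof.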
Explaining Modal Logic Proofs
There has recently been considerable progress in the area of using computers as a tool for theorem proving. In this paper we focus on one facet of human-computer interaction in such systems: generating natural language explanations from proofs. We first discuss the X proof system - a tactic-style theorem proving system for first-order logic with a collection of inference rules corresponding to human-oriented proof techniques. In X, proofs are stored as they are discovered, using a structured term representation. We describe a method for producing natural language explanations of proofs via a simple mapping algorithm from proof structures to text. Nonclassical or specialized logics are often used in specialized applications. For example, modal logics are often used to reason about time and knowledge, and inheritance theories are often developed for classification systems. The form of, and explanations for, proofs in these systems should be tailored to reflect their special features. In this paper, we focus on the extension of X to incorporate proofs in modal logic, and on the different kinds of explanations of modal proofs that can be produced to meet the needs of different users.
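The idea of mapping structured proof terms to text can be illustrated with a toy recursive translator. The term representation and rule names here are invented for illustration; the abstract does not specify the X system's actual data structures.

```python
# Toy proof terms: (rule_name, list_of_premise_proofs, conclusion_string).
# A hypothetical sketch of mapping structured proofs to English text,
# in the spirit of a proof-structure-to-text mapping algorithm.

def explain(proof):
    """Recursively render a proof term as an English explanation."""
    rule, premises, conclusion = proof
    if rule == "assume":
        return f"Assume {conclusion}."
    if rule == "modus_ponens":
        # Explain both premises, then state the inference step.
        return (explain(premises[0]) + " " + explain(premises[1])
                + f" By modus ponens, {conclusion}.")
    if rule == "necessitation":
        # A modal rule: tailor the wording to the modal reading.
        return (explain(premises[0])
                + f" Since this holds in every world, {conclusion}.")
    raise ValueError(f"unknown rule: {rule}")

proof = ("modus_ponens",
         [("assume", [], "P"), ("assume", [], "P implies Q")],
         "Q")
```

Tailoring the output to a logic, as the paper proposes for modal proofs, amounts to adding rule cases with wording that reflects that logic's reading of the connectives.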
The Quantum Strategy of Completeness: On the Self-Foundation of Mathematics
Gentzen’s approach via transfinite induction, and that of intuitionistic Heyting arithmetic, to completeness and the self-foundation of mathematics are compared and opposed to Gödel's incompleteness results for Peano arithmetic. Quantum mechanics involves infinity through Hilbert space, but it is finitist, as is any experimental science. The absence of hidden variables in it, interpretable as its completeness, should resurrect Hilbert’s finitism at the cost of a relevant modification of the latter, already hinted at by intuitionism and by Gentzen’s approaches to completeness. This paper investigates both the conditions and the philosophical background necessary for that modification. The main conclusion is that the concept of infinity underlying contemporary mathematics cannot be reduced to a single Peano arithmetic, but only to at least two such arithmetics, independent of each other. Intuitionism, quantum mechanics, Gentzen’s approaches to completeness, and even Hilbert’s finitism can be unified from that viewpoint. Mathematics may found itself by way of finitism complemented by choice. The concept of information as the quantity of choices underlies that viewpoint. Quantum mechanics, interpretable in terms of information and quantum information, is inseparable from mathematics and its foundation.
Neo-Logicism and Its Logic
The rather unrestrained use of second-order logic in the neo-logicist program is critically examined. It is argued in some detail that it brings with it genuine set-theoretical existence assumptions, and that the mathematical power that Hume’s Principle seems to provide, in the derivation of Frege’s Theorem, comes largely from the ‘logic’ assumed rather than from Hume’s Principle itself. It is shown that Hume’s Principle is in reality not stronger than the very weak Robinson Arithmetic Q. Consequently, only a few rudimentary facts of arithmetic are logically derivable from Hume’s Principle, and that hardly counts as a vindication of logicism.
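For reference, Hume's Principle says that the number of Fs equals the number of Gs just in case the Fs and Gs stand in one-to-one correspondence. In a standard second-order formulation (not quoted from the paper):

```latex
\#F = \#G \;\leftrightarrow\;
\exists R \,\bigl(\,
  \forall x \,(Fx \rightarrow \exists! y\,(Gy \land Rxy))
  \;\land\;
  \forall y \,(Gy \rightarrow \exists! x\,(Fx \land Rxy))
\,\bigr)
```

The second-order quantification over the relation $R$ on the right-hand side is precisely where the paper locates the set-theoretical existence assumptions.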
Intuitionism and logical revision.
The topic of this thesis is logical revision: should we revise the canons of classical reasoning in favour of a weaker logic, such as intuitionistic logic? In the first part of the thesis, I consider two metaphysical arguments against the classical Law of Excluded Middle, arguments whose main premise is the metaphysical claim that truth is knowable. I argue that the first argument, the Basic Revisionary Argument, validates a parallel argument for a conclusion that is unwelcome to classicists and intuitionists alike: that the dual of the Law of Excluded Middle, the Law of Non-Contradiction, is either unknown, or both known and not known to be true. As for the second argument, the Paradox of Knowability, I offer new reasons for thinking that adopting intuitionistic logic does not go to the heart of the matter.
In the second part of the thesis, I motivate an inferentialist framework for assessing competing logics, one on which the meaning of the logical vocabulary is determined by the rules for its correct use. I defend the inferentialist account of understanding from the contention that it is inadequate in principle, and I offer reasons for thinking that the inferentialist approach to logic can help model-theorists and proof-theorists alike justify their logical choices. I then scrutinize the main meaning-theoretic principles on which the inferentialist approach to logic rests: the requirements of harmony and separability. I show that these principles are motivated by the assumption that inference rules are complete, and that the kind of completeness that is necessary for imposing separability is strictly stronger than the completeness needed for requiring harmony. This allows me to reconcile the inferentialist assumption that inference rules are complete with the inherent incompleteness of higher-order logics, an apparent tension that has sometimes been thought to undermine the entire inferentialist project.
I finally turn to the question whether the inferentialist framework is inhospitable in principle to classical logical principles. I compare three different regimentations of classical logic: two old, the multiple-conclusion and the bilateralist ones, and one new. Each of them satisfies the requirements of harmony and separability, but each of them also invokes structural principles that are not accepted by the intuitionist logician. I offer reasons for dismissing multiple-conclusion and bilateralist formalizations of logic, and I argue that we can nevertheless be in harmony with classical logic, if we are prepared to adopt classical rules for disjunction, and if we are willing to treat absurdity as a logical punctuation sign.
Aspects of the constructive omega rule within automated deduction
In general, cut elimination holds for arithmetical systems with the ω-rule, but not for systems with ordinary induction. Hence in the latter there is the problem of generalisation, since arbitrary formulae can be cut in. This makes automatic theorem-proving very difficult. An important technique for investigating derivability in formal systems of arithmetic has been to embed such systems into semi-formal systems with the ω-rule. This thesis describes the implementation of such a system. Moreover, an important application is presented in the form of a new method of generalisation by means of "guiding proofs" in the stronger system, which sometimes succeeds in producing proofs in the original system when other methods fail.
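The ω-rule replaces the induction schema by an infinitary rule: from a proof of A(n) for every numeral n, infer ∀x A(x). In the constructive version, the infinitely many premises are packaged as a single effective function from numerals to proofs, which is what makes an implementation possible at all. The sketch below uses invented names and a trivial proof representation; it is not the thesis's actual system.

```python
# Sketch of the constructive omega-rule: the premises are given as a
# (recursive) function from each numeral n to a proof of A(n).
# All rule and constructor names here are illustrative.

def omega_rule(premise_for):
    """Given a function n -> proof of A(n), build a proof of forall x. A(x).

    Packaging the infinitely many premises as one effective function is
    the key idea of the *constructive* omega-rule.
    """
    return ("forall_intro_omega", premise_for)

def check_instance(proof, n):
    """Spot-check the packaged proof at a particular numeral n."""
    rule, premise_for = proof
    assert rule == "forall_intro_omega"
    return premise_for(n)

# Example: a uniform family of trivial proofs that n + 0 = n.
pf = omega_rule(lambda n: ("eq_refl", f"{n} + 0 = {n}"))
```

Because the premise function is uniform in n, such a proof can serve as a "guiding proof" in the stronger ω-rule system, from which a proof in the original induction-based system is sometimes recovered.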
A system of axiomatic set theory - Part VII
The reader of Part VI will have noticed that among the set-theoretic models considered there, some models were missing which were announced in Part II for certain proofs of independence. These models will be supplied now. Mainly two models have to be constructed: one with the property that there exists a set which is its own only element, and another in which the axioms I-III and VII, but not Va, are satisfied. In either case we need not satisfy the axiom of infinity. Thereby it becomes possible to set up the models on the basis of only I-III, and either VII or Va, a basis from which number theory can be obtained as we saw in Part II. On both these bases the Π0-system of Part VI, which satisfies the axioms I-V and VII, but not VI, can be constructed, as we stated there. An isomorphic model can also be obtained on that basis, by first setting up number theory as in Part II, and then proceeding as Ackermann did. Let us recall the main points of this procedure. For the sake of clarity in the discussion of this and the subsequent models, it will be necessary to distinguish precisely between the concepts which are relative to the basic set-theoretic system, and those which are relative to the model to be defined.