
    The Algebra of Logic Tradition

    The algebra of logic, as an explicit algebraic system showing the underlying mathematical structure of logic, was introduced by George Boole (1815-1864) in his book The Mathematical Analysis of Logic (1847). The methodology initiated by Boole was successfully continued in the 19th century in the work of William Stanley Jevons (1835-1882), Charles Sanders Peirce (1839-1914), and Ernst Schröder (1841-1902), among many others, thereby establishing a tradition in (mathematical) logic. From Boole's first book until the influence after WWI of the monumental work Principia Mathematica (1910-1913) by Alfred North Whitehead (1861-1947) and Bertrand Russell (1872-1970), versions of the algebra of logic were the most developed form of mathematical logic, above all through Schröder's three volumes Vorlesungen über die Algebra der Logik (1890-1905). Furthermore, this tradition motivated the investigations of Leopold Löwenheim (1878-1957) that eventually gave rise to model theory. In addition, in 1941, Alfred Tarski (1901-1983) in his paper On the calculus of relations returned to Peirce's relation algebra as presented in Schröder's Algebra der Logik. The tradition of the algebra of logic played a key role in the notion of Logic as Calculus, as opposed to the notion of Logic as Universal Language. Beyond Tarski's algebra of relations, the influence of the algebraic tradition in logic can be found in other mathematical theories, such as category theory. However, this influence lies outside the scope of this entry, which is divided into 10 sections.
    Fil: Burris, Stanley. University of Waterloo; Canadá.
    Fil: Legris, Javier. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Saavedra 15. Instituto Interdisciplinario de Economía Politica de Buenos Aires. Universidad de Buenos Aires. Facultad de Ciencias Económicas. Instituto Interdisciplinario de Economía Politica de Buenos Aires; Argentina.

    Proof-checking mathematical texts in controlled natural language

    The research conducted for this thesis has been guided by the vision of a computer program that could check the correctness of mathematical proofs written in the language found in mathematical textbooks. Given that reliable processing of unrestricted natural language input is out of the reach of current technology, we focused on the attainable goal of using a controlled natural language (a subset of a natural language defined through a formal grammar) as input language to such a program. We have developed a prototype of such a computer program, the Naproche system. This thesis is centered on the novel logical and linguistic theory needed for defining and motivating the controlled natural language and the proof checking algorithm of the Naproche system. This theory provides means for bridging the wide gap between natural and formal mathematical proofs. We explain how our system makes use of and extends existing linguistic formalisms in order to analyse the peculiarities of the language of mathematics. In this regard, we describe a phenomenon of this language not previously described by logicians or linguists: implicit dynamic function introduction, exemplified by constructs of the form "for every x there is an f(x) such that ...". We show how this function introduction can lead to a paradox analogous to Russell's paradox. To tackle this problem, we developed a novel foundational theory of functions called Ackermann-like Function Theory, which is equiconsistent with ZFC (Zermelo-Fraenkel set theory with the Axiom of Choice) and can be used for imposing limitations on implicit dynamic function introduction in order to avoid this paradox. We give a formal account of implicit dynamic function introduction by extending Dynamic Predicate Logic, a formalism developed by linguists to account for the dynamic nature of natural language quantification, to a novel formalism called Higher-Order Dynamic Predicate Logic, whose semantics is based on Ackermann-like Function Theory. Higher-Order Dynamic Predicate Logic also includes a formal account of the linguistic theory of presuppositions, which we use for clarifying and formally modelling the usage of potentially undefined terms (e.g. 1/x, which is undefined for x=0) and of definite descriptions (e.g. "the even prime number") in the language of mathematics. The semantics of the controlled natural language is defined through a translation from the controlled natural language into an extension of Higher-Order Dynamic Predicate Logic called Proof Text Logic. Proof Text Logic extends Higher-Order Dynamic Predicate Logic in two respects, which make it suitable for representing the content of mathematical texts: it contains features for representing complete texts rather than single assertions, and instead of being based on Ackermann-like Function Theory, it is based on a richer foundational theory called Class-Map-Tuple-Number Theory, which has not only maps/functions but also classes/sets, tuples, numbers and Booleans as primitives. The proof checking algorithm checks the deductive correctness of proof texts written in the controlled natural language of the Naproche system. Since the semantics of the controlled natural language is defined through a translation into the Proof Text Logic formalism, the proof checking algorithm is defined on Proof Text Logic input. The algorithm makes use of automated theorem provers for checking the correctness of single proof steps.
In this way, the proof steps in the input text do not need to be as fine-grained as in formal proof calculi, but may contain several reasoning steps at once, just as is usual in natural mathematical texts. The proof checking algorithm has to recognize implicit dynamic function introductions in the input text and has to take care of presuppositions of mathematical statements according to the principles of the formal account of presuppositions mentioned above. We prove two soundness and two completeness theorems for the proof checking algorithm: In each case one theorem compares the algorithm to the semantics of Proof Text Logic and one theorem compares it to the semantics of standard first-order predicate logic. As a case study for the theory developed in the thesis, we illustrate the working of the Naproche system on a controlled natural language adaptation of the beginning of Edmund Landau's Grundlagen der Analysis.
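
The step-by-step checking described above can be pictured schematically. The following is a minimal sketch under stated assumptions, not the actual Naproche implementation: the names ProofStep, atp_proves and check_text are illustrative placeholders, with atp_proves standing in for a call to an external automated theorem prover.

```python
# Hypothetical sketch of a step-by-step proof checker in the spirit of the
# abstract: each step's presuppositions and claim are discharged by an
# automated theorem prover against the accumulated context.

from dataclasses import dataclass
from typing import List

@dataclass
class ProofStep:
    premises: List[str]          # formulas locally assumed in this step
    presuppositions: List[str]   # e.g. definedness conditions such as "x != 0" for 1/x
    claim: str                   # the assertion made by this proof step

def atp_proves(premises: List[str], goal: str) -> bool:
    """Placeholder for a call to an external automated theorem prover."""
    raise NotImplementedError

def check_text(steps: List[ProofStep]) -> bool:
    context: List[str] = []
    for step in steps:
        # Presuppositions are checked before the claim itself, mirroring the
        # treatment of potentially undefined terms and definite descriptions.
        for presup in step.presuppositions:
            if not atp_proves(context + step.premises, presup):
                return False
        if not atp_proves(context + step.premises, step.claim):
            return False
        context.append(step.claim)   # accepted claims enlarge the context
    return True
```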

    Hilbert's Metamathematical Problems and Their Solutions

    This dissertation examines several of the problems that Hilbert discovered in the foundations of mathematics, from a metalogical perspective. The problems manifest themselves in four different aspects of Hilbert’s views: (i) Hilbert’s axiomatic approach to the foundations of mathematics; (ii) his response to criticisms of set theory; (iii) his response to intuitionist criticisms of classical mathematics; (iv) his contribution to the specification of the role of logical inference in mathematical reasoning. This dissertation argues that Hilbert’s axiomatic approach was guided primarily by model-theoretical concerns. Accordingly, the ultimate aim of his consistency program was to prove the model-theoretical consistency of mathematical theories. It turns out that for the purpose of carrying out such consistency proofs, a suitable modification of ordinary first-order logic is needed. To effect this modification, independence-friendly (IF) logic provides the appropriate conceptual framework. It is then shown how the model-theoretical consistency of arithmetic can be proved by using IF logic as its basic logic. Hilbert’s other problems, manifesting themselves as aspects (ii), (iii), and (iv)—most notably the problem of the status of the axiom of choice, the problem of the role of the law of excluded middle, and the problem of giving an elementary account of quantification—can likewise be approached by using the resources of IF logic. It is shown that by means of IF logic one can carry out Hilbertian solutions to all these problems. The two major results concerning aspects (ii), (iii) and (iv) are the following: (a) The axiom of choice is a logical principle; (b) The law of excluded middle divides metamathematical methods into elementary and non-elementary ones. It is argued that these results show that IF logic helps to vindicate Hilbert’s nominalist philosophy of mathematics. On the basis of an elementary approach to logic, which enriches the expressive resources of ordinary first-order logic, this dissertation shows how the different problems that Hilbert discovered in the foundations of mathematics can be solved.
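
For readers unfamiliar with IF logic, the connection to the axiom of choice can be illustrated schematically (this example is ours, not taken from the dissertation): under game-theoretic semantics an ordinary ∀∃ prefix already asserts the existence of a choice (Skolem) function, and slashed quantifiers make explicit which quantifiers a variable may depend on.

```latex
% Skolem-function reading of an ordinary quantifier prefix:
\forall x\,\exists y\,\varphi(x,y)
  \;\Longleftrightarrow\;
  \exists f\,\forall x\,\varphi\bigl(x, f(x)\bigr)
% An IF-logic slash makes the dependence pattern explicit:
% here y may depend on x but not on z.
\forall x\,\forall z\,(\exists y/\forall z)\,\psi(x,y,z)
```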

    Hilbert's epsilon as an Operator of Indefinite Committed Choice

    Paul Bernays and David Hilbert carefully avoided overspecification of Hilbert's epsilon-operator and axiomatized only what was relevant for their proof-theoretic investigations. Semantically, this left the epsilon-operator underspecified. In the meantime, there have been several suggestions for a semantics of the epsilon as a choice operator. After reviewing the literature on the semantics of Hilbert's epsilon operator, we propose a new semantics with the following features: we avoid overspecification (such as right-uniqueness), but admit indefinite choice, committed choice, and classical logics. Moreover, our semantics for the epsilon supports proof search optimally and is natural in the sense that it not only mirrors some cases of the referential interpretation of indefinite articles in natural language, but may also contribute to the philosophy of language. Finally, we ask whether our epsilon within our free-variable framework can serve as a paradigm useful in the specification and computation of the semantics of discourses in natural language.
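
For orientation (our addition, not part of the abstract), the epsilon calculus of Hilbert and Bernays governs the operator by the critical formula alone, and the quantifiers then become definable from epsilon terms:

```latex
% Critical formula: if any term t satisfies A, then so does the epsilon term.
A(t) \;\rightarrow\; A\bigl(\varepsilon x\,A(x)\bigr)
% Quantifiers defined via epsilon:
\exists x\,A(x) \;\leftrightarrow\; A\bigl(\varepsilon x\,A(x)\bigr)
\qquad
\forall x\,A(x) \;\leftrightarrow\; A\bigl(\varepsilon x\,\neg A(x)\bigr)
```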

    The Logical Writings of Karl Popper

    This open access book is the first ever collection of Karl Popper's writings on deductive logic. Karl R. Popper (1902-1994) was one of the most influential philosophers of the 20th century. His philosophy of science ("falsificationism") and his social and political philosophy ("open society") have been widely discussed well beyond academic philosophy. What is not so well known is that Popper also produced a considerable body of work on the foundations of deductive logic, most of it published at the end of the 1940s as articles in scattered places. This little-known work deserves to be better known, as it is highly significant for modern proof-theoretic semantics. This collection assembles Popper's published writings on deductive logic in a single volume, together with all reviews of these papers. It also contains a large amount of unpublished material from the Popper Archives, including Popper's correspondence related to deductive logic and manuscripts that were (almost) finished but did not reach the publication stage. All of these items are critically edited with additional comments by the editors. A general introduction puts Popper's work into the context of current discussions on the foundations of logic. This book should be of interest to logicians, philosophers, and anybody concerned with Popper's work.

    Ramsification and semantic indeterminacy

    Is it possible to maintain classical logic, stay close to classical semantics, and yet accept that language might be semantically indeterminate? The article gives an affirmative answer by Ramsifying classical semantics, which yields a new semantic theory that remains much closer to classical semantics than supervaluationism but at the same time avoids the problematic classical presupposition of semantic determinacy. The resulting Ramsey semantics is developed in detail; it is shown to supply a classical concept of truth and to fully support the rules and metarules of classical logic, and it is applied to vague terms as well as to theoretical or open-ended terms from mathematics and science. The theory also demonstrates how diachronic or synchronic interpretational continuity across languages is compatible with semantic indeterminacy.
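
Schematically (our gloss, not a formula from the article), Ramsifying classical semantics means treating the intended interpretation as a theoretical term of the semantic theory and binding it with an existential quantifier over admissible interpretations:

```latex
% Classical semantics presupposes one determinate intended interpretation I^{*}:
%   \mathrm{Tr}(\varphi) \;\leftrightarrow\; I^{*} \models \varphi .
% Its Ramsification existentially quantifies that interpretation away:
\exists I \,\Bigl( \mathrm{Adm}(I) \;\wedge\; \forall \varphi\,
    \bigl( \mathrm{Tr}(\varphi) \leftrightarrow I \models \varphi \bigr) \Bigr)
```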

    Boundary Algebra: A Simpler Approach to Boolean Algebra and the Sentential Connectives

    Boundary algebra [BA] is an algebra that serves as a simplified notation for Spencer-Brown’s (1969) primary algebra. The syntax of the primary arithmetic [PA] consists of two atoms, () and the blank page, concatenation, and enclosure between ‘(’ and ‘)’, denoting the primitive notion of distinction. Inserting letters denoting, indifferently, the presence or absence of () into a PA formula yields a BA formula. The BA axioms are A1: ()() = (), and A2: “(()) [abbreviated ‘⊥’] may be written or erased at will,” implying (⊥) = (). The repeated application of A1 and A2 simplifies any PA formula to either () or ⊥. The basis for BA is B1: abc = bca (concatenation commutes and associates); B2: ⊥a = a (BA has a lower bound, ⊥); B3: (a)a = () (BA is a complemented lattice); and B4: (ba)a = (b)a (implying that BA is a distributive lattice). BA has two intended models: (1) the Boolean algebra 2 with base set B = {(),⊥}, such that () ⇔ 1 [dually 0], (a) ⇔ a′, and ab ⇔ a∪b [a∩b]; and (2) sentential logic, such that () ⇔ true [false], (a) ⇔ ~a, and ab ⇔ a∨b [a∧b]. BA is a self-dual notation, facilitates a calculational style of proof, and simplifies clausal reasoning and Quine’s truth value analysis. BA resembles C.S. Peirce’s graphical logic, the symbolic logics of Leibniz and W.E. Johnson, the 2 notation of Byrne (1946), and the Boolean term schemata of Quine (1982).
    Keywords: boundary algebra; boundary logic; primary algebra; primary arithmetic; Boolean algebra; calculation proof; G. Spencer-Brown; C.S. Peirce; existential graphs
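
The Boolean model (1) can be checked mechanically. The following is our own illustration, not code from the paper: boundary forms are encoded as nested Python lists (strings are variables, a nested list is the enclosure of the form it contains), evaluated by reading () as 1, the blank page as 0, enclosure as complement, and concatenation as join; the basis B1-B4 is then verified over all assignments.

```python
# Verify that the basis B1-B4 holds in the two-element Boolean interpretation
# of boundary algebra described in the abstract.

from itertools import product

def ev(form, env):
    """Evaluate a boundary form under the truth-value assignment env."""
    value = False                               # the blank page denotes 0 (false)
    for item in form:
        if isinstance(item, str):
            value = value or env[item]          # concatenation acts as join (or)
        else:
            value = value or not ev(item, env)  # enclosure acts as complement (not)
    return value

MARK_EL = []          # the element "()": an enclosure of the blank page
BOT_EL = [MARK_EL]    # the element "(())", abbreviated '⊥' in the abstract

for a, b, c in product([False, True], repeat=3):
    env = {"a": a, "b": b, "c": c}
    assert ev(["a", "b", "c"], env) == ev(["b", "c", "a"], env)      # B1: abc = bca
    assert ev([BOT_EL, "a"], env) == ev(["a"], env)                  # B2: ⊥a = a
    assert ev([["a"], "a"], env) == ev([MARK_EL], env)               # B3: (a)a = ()
    assert ev([["b", "a"], "a"], env) == ev([["b"], "a"], env)       # B4: (ba)a = (b)a
print("B1-B4 hold in the two-element Boolean model")
```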