2,097 research outputs found

    Variations on the Theme of Conning in Mathematical Economics

    The mathematization of economics is almost exclusively in terms of the mathematics of real analysis which, in turn, is founded on set theory (and the axiom of choice) and orthodox mathematical logic. In this paper I try to point out that this kind of mathematization is replete with economic infelicities. The attempt to extract these infelicities is in terms of three main examples: dynamics, policy, and rational expectations and learning. The focus is on the role of, and reliance on, standard fixed point theorems in orthodox mathematical economics.
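    As a point of reference for the critique above (my addition, not part of the abstract), the prototypical example of such a "standard fixed point theorem" is Brouwer's, in its usual form:
    \[
      f : K \to K \ \text{continuous}, \quad K \subset \mathbb{R}^{n} \ \text{nonempty, compact and convex} \;\Longrightarrow\; \exists\, x^{*} \in K : f(x^{*}) = x^{*}.
    \]
    In its classical form the theorem asserts that the fixed point exists without providing a procedure for computing it, one reason such results attract constructive and algorithmic scrutiny.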

    Sraffa's Mathematical Economics - A Constructive Interpretation

    The claim in this paper is that Sraffa employed a rigorous logic of mathematical reasoning in his book, Production of Commodities by Means of Commodities (PCC), in such a way that the existence proofs were constructive. This is the kind of mathematics that was prevalent at the beginning of the 19th century, which was dominated by the concrete, the constructive and the algorithmic. It is, therefore, completely consistent with the economics of the 19th century, which was the fulcrum around which the economics of PCC was conceived.
    Keywords: Existence Proofs, Constructive Mathematics, Algorithmic Mathematics, Mathematical Economics, Standard System.

    From mathematics in logic to logic in mathematics: Boole and Frege

    This project proceeds from the premise that the historical and logical value of Boole's logical calculus and its connection with Frege's logic remain to be recognised. It begins by discussing Gillies' application of Kuhn's concepts to the history of logic and proposing the use of the concept of research programme as a methodological tool in the historiography of logic. Then it analyses the development of mathematical logic from Boole to Frege in terms of overlapping research programmes whilst discussing especially Boole's logical calculus. Two streams of development run through the project: 1. A discussion and appraisal of Boole's research programme in the context of logical debates and the emergence of symbolical algebra in Britain in the nineteenth century, including the improvements which Venn brings to logic as algebra, and the axiomatisation of 'Boolean algebras', which is due to Huntington and Sheffer. 2. An investigation of the particularity of the Fregean research programme, including an analysis of the extent to which certain elements of Begriffsschrift are new; and an account of Frege's discussion of Boole which focuses on the domain common to the two formal languages and shows the logical connection between Boole's logical calculus and Frege's. As a result, it is shown that the progress made in mathematical logic stemmed from two continuous and overlapping research programmes: Boole's introduction of mathematics in logic and Frege's introduction of logic in mathematics. In particular, Boole is regarded as the grandfather of metamathematics, and Lowenheim's theorem of 1915 is seen as a revival of his research programme.

    Independence relations in abstract elementary categories

    In model theory, a branch of mathematical logic, we can classify mathematical structures based on their logical complexity. This yields the so-called stability hierarchy. Independence relations play an important role in this stability hierarchy. An independence relation tells us which subsets of a structure contain information about each other, for example: linear independence in vector spaces yields such a relation. Some important classes in the stability hierarchy are stable, simple and NSOP1, each being contained in the next. For each of these classes there exists a so-called Kim-Pillay style theorem. Such a theorem describes the interaction between independence relations and the stability hierarchy. For example: simplicity is equivalent to admitting a certain independence relation, which must then be unique. All of the above classically takes place in full first-order logic. Parts of it have already been generalised to other frameworks, such as continuous logic, positive logic and even a very general category-theoretic framework. In this thesis we continue this work. We introduce the framework of AECats, which are a specific kind of accessible category. We prove that there can be at most one stable, simple or NSOP1-like independence relation in an AECat. We thus recover (part of) the original stability hierarchy. For this we introduce the notions of long dividing, isi-dividing and long Kim-dividing, which are based on the classical notions of dividing and Kim-dividing but are such that they work well without compactness. Switching frameworks, we generalise Kim-dividing in NSOP1 theories to positive logic. We prove that Kim-dividing over existentially closed models has all the nice properties that it is known to have in full first-order logic. We also provide a full Kim-Pillay style theorem: a positive theory is NSOP1 if and only if there is a nice enough independence relation, which then must be given by Kim-dividing.
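    As a minimal sketch of the vector-space example mentioned in the abstract (the notation is mine, not the thesis's): writing A \perp_C B for "A is independent from B over C", linear independence yields the relation
    \[
      A \perp_C B \;:\Longleftrightarrow\; \operatorname{span}(A \cup C) \cap \operatorname{span}(B \cup C) = \operatorname{span}(C),
    \]
    i.e. the only information A and B share, beyond what C already provides, lies in the span of C. Relations of this kind, satisfying suitable axioms, are what Kim-Pillay style theorems characterize.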

    The Babelogic of Mathematics

    How would the Bible written about a Mathematical God start, describing the Creation of Mathematics and Logic? How would Rigveda's Nasadiya sukta read if it were describing the Void before mathematics was born? Here is an attempt at a partial answer, one which takes the original Genesis chapter and the Nasadiya sukta and makes suitable changes to create a fairly consistent, if somewhat anachronistic narrative (with the slight mixing up of Bertrand Russell and Lobachevsky/Bolyai attributable to Babelogic), along with a new ending to the Beginning...

    Logic, mathematics, physics: from a loose thread to the close link: Or what gravity is for both logic and mathematics rather than only for physics

    Gravitation is interpreted to be an “ontomathematical” force or interaction rather than a merely physical one. That approach restores Newton’s original design of universal gravitation in the framework of “The Mathematical Principles of Natural Philosophy”, which allows for Einstein’s special and general relativity also to be reinterpreted ontomathematically. The entanglement theory of quantum gravitation is inherently involved also ontomathematically by virtue of the consideration of the qubit Hilbert space after entanglement as the Fourier counterpart of pseudo-Riemannian space. Gravitation can also be interpreted as a purely mathematical or logical “force” or “interaction” as a corollary of its ontomathematical (rather than physical) realization. The ontomathematical approach to gravitation is implicit in general relativity, which equates it to operators in pseudo-Riemannian space obeying the Einstein field equation, and is also well known as the “geometrization of physics”. Quantum mechanics shares the same approach through the separable complex Hilbert space, defining “physical quantity” by the Hermitian operators on it. One can interpret the Minkowski space involved in special relativity and the qubit Hilbert space of quantum information as Fourier counterparts, immediately noticing that general relativity means gravitation as the Fourier counterpart of non-Hermitian operators, implying non-unitarity and the violation of energy conservation and thus destroying Pauli’s particle paradigm. Since the Standard model obeys it, this explains the impossibility of “quantum gravitation” in any framework conservatively generalizing the Standard model so that it would include gravitation along with electromagnetic, weak, and strong interactions. Einstein’s geometrization of gravitation can be continued into a purely mathematical theory of it, following Euclid’s realization that geometry can be exhaustively built in a deductive and axiomatic way as well as Riemann’s parametrization of the whole class of Euclidean and non-Euclidean geometries by “space curvature”, then being generalized to Minkowski space and to the operators on pseudo-Riemannian space through which the Einstein field equation means gravitation. The transition from mathematical gravitation to a logical one can rely on the historical lesson of the pair of Lobachevsky’s and Riemann’s approaches, now “reversely”, i.e., from the latter to the former. Logical gravitation is linkable to Hegel’s dialectical logic and ontological dialectics, abandoning their interpretations as a new zero logic substituting classical propositional logic. The approach of ontomathematics, generalizing that of ontology and traceable even to Aristotle’s reformation of Plato’s doctrine, needs Hegel’s doctrine to be formalized as a first-order logic naturally containing Boolean algebra, isomorphic to both classical propositional logic and set theory being the class of all first-order logics, as a sub-logic, along with Peano arithmetic as another sub-logic. The first-order logic at issue is called Hilbert arithmetic and is elaborated in detail in other papers.
It allows both for the self-foundation of mathematics to be internally proved as complete and, furthermore, for quantum mechanics, reinterpreted as quantum information, to be included by the qubit Hilbert space, interpretable in turn as a dual and physical counterpart of Hilbert arithmetic in a narrow sense; that is, both counterparts constitute Hilbert arithmetic in a wide sense, being mathematical and physical simultaneously and thus overcoming the Cartesian dualism of “body” gapped from “mind” by an abyss. Then, the proper philosophical interpretation of gravitation as the fundamental ontomathematical force or interaction overcomes the ridiculous belief in the Big Bang, wrongly alleged to be a scientific theory. Ontomathematical gravitation suggests an omnipresent and omnitemporal medium of “God’s” creation “ex nihilo” following only the natural necessity of quantum-information conservation, particularly and locally manifested as energy conservation.
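    For reference (my addition, not part of the abstract), the Einstein field equation that the text equates with gravitation as operators on pseudo-Riemannian space has the standard form
    \[
      R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu},
    \]
    relating the curvature of the pseudo-Riemannian metric g_{\mu\nu} to the stress-energy tensor T_{\mu\nu}; the “geometrization of physics” mentioned above refers to this identification of gravitation with spacetime geometry.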

    The Major Contribution of Leibniz to Infinitesimal Calculus

    A study of the work of Leibniz is of importance for at least two reasons. In the first place, Leibniz was not alone among great men in presenting in his early works almost all the important mathematical ideas contained in his mature work. In the second place, the main ideas of his philosophy are to be attributed to his mathematical work, not vice versa. He was perhaps the earliest to realize fully and correctly the important influence of a calculus on discovery. The almost mechanical operations which one goes through when one is using a calculus enable one to discover facts of mathematics or logic without any of that expenditure of the energy of thought which is so necessary when one is dealing with a department of knowledge that has not yet been reduced to the domain of operation of a calculus. These operations were developed and perfected by Gottfried Wilhelm Leibniz, and this places all mathematicians of today in his debt.
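    As an illustration of those "almost mechanical operations" (my example, not the paper's), Leibniz's rules for the differential reduce differentiation to symbol manipulation:
    \[
      d(u + v) = du + dv, \qquad d(uv) = u\,dv + v\,du, \qquad d\!\left(\frac{u}{v}\right) = \frac{v\,du - u\,dv}{v^{2}}.
    \]
    Applying them term by term gives, for instance, d(x^{2}) = x\,dx + x\,dx = 2x\,dx, with no fresh expenditure of thought on limits or on the geometric meaning of the symbols.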

    Invariance and Necessity

    Properties and relations in general have a certain degree of invariance, and some types of properties/relations have a stronger degree of invariance than others. In this paper I will show how the degrees of invariance of different types of properties are associated with, and explain, the modal force of the laws governing them. This explains differences in the modal force of laws/principles of different disciplines, starting with logic and mathematics and proceeding to physics and biology.

    Proof-checking mathematical texts in controlled natural language

    The research conducted for this thesis has been guided by the vision of a computer program that could check the correctness of mathematical proofs written in the language found in mathematical textbooks. Given that reliable processing of unrestricted natural language input is out of the reach of current technology, we focused on the attainable goal of using a controlled natural language (a subset of a natural language defined through a formal grammar) as input language to such a program. We have developed a prototype of such a computer program, the Naproche system. This thesis is centered around the novel logical and linguistic theory needed for defining and motivating the controlled natural language and the proof checking algorithm of the Naproche system. This theory provides means for bridging the wide gap between natural and formal mathematical proofs. We explain how our system makes use of and extends existing linguistic formalisms in order to analyse the peculiarities of the language of mathematics. In this regard, we describe a phenomenon of this language previously not described by other logicians or linguists, the implicit dynamic function introduction, exemplified by constructs of the form "for every x there is an f(x) such that ...". We show how this function introduction can lead to a paradox analogous to Russell's paradox. To tackle this problem, we developed a novel foundational theory of functions called Ackermann-like Function Theory, which is equiconsistent to ZFC (Zermelo-Fraenkel set theory with the Axiom of Choice) and can be used for imposing limitations to implicit dynamic function introduction in order to avoid this paradox. We give a formal account of implicit dynamic function introduction by extending Dynamic Predicate Logic, a formalism developed by linguists to account for the dynamic nature of natural language quantification, to a novel formalism called Higher-Order Dynamic Predicate Logic, whose semantics is based on Ackermann-like Function Theory. Higher-Order Dynamic Predicate Logic also includes a formal account of the linguistic theory of presuppositions, which we use for clarifying and formally modelling the usage of potentially undefined terms (e.g. 1/x, which is undefined for x=0) and of definite descriptions (e.g. "the even prime number") in the language of mathematics. The semantics of the controlled natural language is defined through a translation from the controlled natural language into an extension of Higher-Order Dynamic Predicate Logic called Proof Text Logic. Proof Text Logic extends Higher-Order Dynamic Predicate Logic in two respects, which make it suitable for representing the content of mathematical texts: It contains features for representing complete texts rather than single assertions, and instead of being based on Ackermann-like Function Theory, it is based on a richer foundational theory called Class-Map-Tuple-Number Theory, which does not only have maps/functions, but also classes/sets, tuples, numbers and Booleans as primitives. The proof checking algorithm checks the deductive correctness of proof texts written in the controlled natural language of the Naproche system. Since the semantics of the controlled natural language is defined through a translation into the Proof Text Logic formalism, the proof checking algorithm is defined on Proof Text Logic input. The algorithm makes use of automated theorem provers for checking the correctness of single proof steps. 
In this way, the proof steps in the input text do not need to be as fine-grained as in formal proof calculi, but may contain several reasoning steps at once, just as is usual in natural mathematical texts. The proof checking algorithm has to recognize implicit dynamic function introductions in the input text and has to take care of presuppositions of mathematical statements according to the principles of the formal account of presuppositions mentioned above. We prove two soundness and two completeness theorems for the proof checking algorithm: In each case one theorem compares the algorithm to the semantics of Proof Text Logic and one theorem compares it to the semantics of standard first-order predicate logic. As a case study for the theory developed in the thesis, we illustrate the working of the Naproche system on a controlled natural language adaptation of the beginning of Edmund Landau's Grundlagen der Analysis.
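    To see how implicit dynamic function introduction can go wrong (an illustrative sketch in the spirit of the abstract, not necessarily the thesis's exact construction), let the construct "for every x there is an f(x) such that ..." range over all objects, functions included, and fill in the dots with a diagonal condition:
    \[
      \text{for every } g \text{ there is an } f(g) \text{ such that } f(g) = 1 \iff g(g) \neq 1.
    \]
    Taking g to be f itself yields f(f) = 1 \iff f(f) \neq 1, a functional analogue of Russell's paradox; Ackermann-like Function Theory is the thesis's foundational device for limiting such introductions while remaining equiconsistent with ZFC.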