70 research outputs found

    Topics in String Theory and Quantum Gravity

    Full text link
    These are the lecture notes for the Les Houches Summer School on Quantum Gravity held in July 1992. The notes present a general critical assessment of other (non-string) approaches to quantum gravity, and a selected set of topics concerning what we have learned so far about the subject from string theory. Since these lectures are long (133 A4 pages), we include in this abstract the table of contents, which should help the user of the bulletin board in deciding whether to latex and print the full file.
    1. FIELD THEORETICAL APPROACH TO QUANTUM GRAVITY: Linearized gravity; Supergravity; Kaluza-Klein theories; Quantum field theory and classical gravity; Euclidean approach to Quantum Gravity; Canonical quantization of gravity; Gravitational Instantons.
    2. CONSISTENCY CONDITIONS: ANOMALIES: Generalities about anomalies; Spinors in 2n dimensions; When can we expect to find anomalies?; The Atiyah-Singer Index Theorem and the computation of anomalies; Examples: Green-Schwarz cancellation mechanism and Witten's SU(2) global anomaly.
    3. STRING THEORY I. BOSONIC STRING: Bosonic string; Conformal Field Theory; Quantization of the bosonic string; Interaction in string theory and the characterization of the moduli space; Bosonic strings with background fields; Stringy corrections to the Einstein equations; Toroidal compactifications; R-duality; Operator formalism.
    4. STRING THEORY II. FERMIONIC STRINGS: Fermionic String; Heterotic String; Strings at finite temperature; Is string theory finite?
    5. OTHER DEVELOPMENTS AND CONCLUSIONS: String "Phenomenology"; Black Holes and Related Subjects.
    Comment: 133 pages, 22 figures (not included, available upon request), LaTeX.

    Notes in Pure Mathematics & Mathematical Structures in Physics

    Full text link
    These Notes deal with various areas of mathematics, seeking reciprocal combinations and exploring mutual relations, ranging from abstract objects to problems in physics.
    Comment: Small improvements and additions.

    Rethinking inconsistent mathematics

    Get PDF
    This dissertation has two main goals. The first is to provide a practice-based analysis of the field of inconsistent mathematics: What motivates it? What role does logic have in it? What distinguishes it from classical mathematics? Is it alternative or revolutionary? The second goal is to introduce and defend a new conception of inconsistent mathematics - queer incomaths - as a particularly effective answer to feminist critiques of classical logic and mathematics. This sets the stage for a genuine revolution in mathematics, insofar as it suggests the need for a shift in mainstream attitudes about the role of logic and ethics in the practice of mathematics.

    Topics in Programming Languages, a Philosophical Analysis through the case of Prolog

    Get PDF
    Programming languages seldom find proper anchorage in philosophy of logic, language and science. What is more, philosophy of language seems to be restricted to natural languages and linguistics, and even philosophy of logic is rarely framed in terms of programming languages. The logic programming paradigm and Prolog are, thus, the most adequate paradigm and programming language to work on this subject, combining natural language processing and linguistics, logic programming and constriction methodology on both algorithms and procedures, with an overall philosophizing declarative status. Not only this, but the dimension of the Fifth Generation Computer Systems project related to strong AI, in which Prolog took a major role, and its historical frame in the very crucial dialectic between procedural and declarative paradigms, and between structuralist and empiricist biases, serve, in exemplary form, to treat head-on the philosophy of logic, language and science in the contemporary age as well. In recounting Prolog's philosophical, mechanical and algorithmic harbingers, the opportunity is open to various routes. We herein shall exemplify some: the mechanical-computational background explored by Pascal, Leibniz, Boole, Jacquard, Babbage and Konrad Zuse, up to the ACE (Alan Turing) and the EDVAC (von Neumann), offering the backbone of computer architecture, together with the parallel work of Turing, Church, Gödel, Kleene, von Neumann, Shannon and others on computability, thoroughly studied in detail, permits us to interpret the evolving realm of programming languages. The proper line from the lambda calculus to the Algol family, the declarative and procedural split with the C language and Prolog, and the ensuing branching, explosion and further delimitation of programming languages, are thereupon inspected so as to relate them to the proper syntax, semantics and philosophical élan of logic programming and Prolog.

    Proof-checking mathematical texts in controlled natural language

    Get PDF
    The research conducted for this thesis has been guided by the vision of a computer program that could check the correctness of mathematical proofs written in the language found in mathematical textbooks. Given that reliable processing of unrestricted natural language input is out of the reach of current technology, we focused on the attainable goal of using a controlled natural language (a subset of a natural language defined through a formal grammar) as input language to such a program. We have developed a prototype of such a computer program, the Naproche system. This thesis is centered around the novel logical and linguistic theory needed for defining and motivating the controlled natural language and the proof checking algorithm of the Naproche system. This theory provides means for bridging the wide gap between natural and formal mathematical proofs. We explain how our system makes use of and extends existing linguistic formalisms in order to analyse the peculiarities of the language of mathematics. In this regard, we describe a phenomenon of this language not previously described by other logicians or linguists, implicit dynamic function introduction, exemplified by constructs of the form "for every x there is an f(x) such that ...". We show how this function introduction can lead to a paradox analogous to Russell's paradox. To tackle this problem, we developed a novel foundational theory of functions called Ackermann-like Function Theory, which is equiconsistent with ZFC (Zermelo-Fraenkel set theory with the Axiom of Choice) and can be used for imposing limitations on implicit dynamic function introduction in order to avoid this paradox. We give a formal account of implicit dynamic function introduction by extending Dynamic Predicate Logic, a formalism developed by linguists to account for the dynamic nature of natural language quantification, to a novel formalism called Higher-Order Dynamic Predicate Logic, whose semantics is based on Ackermann-like Function Theory. Higher-Order Dynamic Predicate Logic also includes a formal account of the linguistic theory of presuppositions, which we use for clarifying and formally modelling the usage of potentially undefined terms (e.g. 1/x, which is undefined for x=0) and of definite descriptions (e.g. "the even prime number") in the language of mathematics. The semantics of the controlled natural language is defined through a translation from the controlled natural language into an extension of Higher-Order Dynamic Predicate Logic called Proof Text Logic. Proof Text Logic extends Higher-Order Dynamic Predicate Logic in two respects, which make it suitable for representing the content of mathematical texts: it contains features for representing complete texts rather than single assertions, and instead of being based on Ackermann-like Function Theory, it is based on a richer foundational theory called Class-Map-Tuple-Number Theory, which has not only maps/functions, but also classes/sets, tuples, numbers and Booleans as primitives. The proof checking algorithm checks the deductive correctness of proof texts written in the controlled natural language of the Naproche system. Since the semantics of the controlled natural language is defined through a translation into the Proof Text Logic formalism, the proof checking algorithm is defined on Proof Text Logic input. The algorithm makes use of automated theorem provers for checking the correctness of single proof steps.
In this way, the proof steps in the input text do not need to be as fine-grained as in formal proof calculi, but may contain several reasoning steps at once, just as is usual in natural mathematical texts. The proof checking algorithm has to recognize implicit dynamic function introductions in the input text and has to take care of presuppositions of mathematical statements according to the principles of the formal account of presuppositions mentioned above. We prove two soundness and two completeness theorems for the proof checking algorithm: in each case one theorem compares the algorithm to the semantics of Proof Text Logic and one theorem compares it to the semantics of standard first-order predicate logic. As a case study for the theory developed in the thesis, we illustrate the working of the Naproche system on a controlled natural language adaptation of the beginning of Edmund Landau's Grundlagen der Analysis.
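    The Russell-style paradox mentioned in this abstract can be sketched as follows. This is a hypothetical diagonal instance written in LaTeX purely for illustration, assuming unrestricted instantiation of the newly introduced function symbol; it is not necessarily the exact construction developed in the thesis.

        % Implicit dynamic function introduction, read as licensing a new
        % function symbol f applicable to arbitrary objects:
        %   "For every x there is an f(x) such that f(x) = 0 iff x(x) is not 0."
        \[
          \forall x \;\exists f(x)\; \bigl( f(x) = 0 \leftrightarrow x(x) \neq 0 \bigr)
        \]
        % If the introduced f may itself be substituted for x, taking x := f gives
        \[
          f(f) = 0 \;\leftrightarrow\; f(f) \neq 0,
        \]
        % a contradiction obtained by diagonalization, analogous to Russell's
        % paradox; a foundational theory such as the Ackermann-like Function
        % Theory of the thesis must therefore restrict such introductions.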

    Virtual Mathematics: the logic of difference

    Get PDF
    Of all twentieth century philosophers, it is Gilles Deleuze whose work agitates most forcefully for a worldview privileging becoming over being, difference over sameness; the world as a complex, open set of multiplicities. Nevertheless, Deleuze remains singular in enlisting mathematical resources to underpin and inform such a position, refusing the hackneyed opposition between ‘static’ mathematical logic versus ‘dynamic’ physical world. This is an international collection of work commissioned from foremost philosophers, mathematicians and philosophers of science, to address the wide range of problematics and influences in this most important strand of Deleuze’s thinking. Contributors are Charles Alunni, Alain Badiou, Gilles Châtelet, Manuel DeLanda, Simon Duffy, Robin Durie, Aden Evens, Arkady Plotnitsky, Jean-Michel Salanskis, Daniel Smith and David Webb

    Aspects of emergent cyclicity in language and computation

    Get PDF
    This thesis has four parts, which correspond to the presentation and development of a theoretical framework for the study of cognitive capacities qua physical phenomena, and a case study of locality conditions over natural languages. Part I deals with computational considerations, setting the tone of the rest of the thesis, and introducing and defining critical concepts like ‘grammar’, ‘automaton’, and the relations between them. Fundamental questions concerning the place of formal language theory in linguistic inquiry, as well as the expressibility of linguistic and computational concepts in common terms, are raised in this part. Part II further explores the issues addressed in Part I with particular emphasis on how grammars are implemented by means of automata, and the properties of the formal languages that these automata generate. We will argue against the equation between effective computation and function-based computation, and introduce examples of computable procedures which are nevertheless impossible to capture using traditional function-based theories. The connection with cognition will be made in the light of dynamical frustrations: the irreconcilable tension between mutually incompatible tendencies that hold for a given dynamical system. We will provide arguments in favour of analyzing natural language as emerging from a tension between different systems (essentially, semantics and morpho-phonology) which impose orthogonal requirements over admissible outputs. The concept of level of organization or scale comes to the foreground here; and apparent contradictions and incommensurabilities between concepts and theories are revisited in a new light: that of dynamical nonlinear systems which are fundamentally frustrated. We will also characterize the computational system that emerges from such an architecture: the goal is to get a syntactic component which assigns the simplest possible structural description to sub-strings, in terms of its computational complexity. A system which can oscillate back and forth in the hierarchy of formal languages in assigning structural representations to local domains will be referred to as a computationally mixed system. Part III is where the really fun stuff starts. Field theory is introduced, and its applicability to neurocognitive phenomena is made explicit, with all due scale considerations. Physical and mathematical concepts are permanently interacting as we analyze phrase structure in terms of pseudo-fractals (in Mandelbrot’s sense) and define syntax as a (possibly unary) set of topological operations over completely Hausdorff (CH) ultrametric spaces. These operations, which make field perturbations interfere, transform that initial completely Hausdorff ultrametric space into a metric, Hausdorff space with a weaker separation axiom. Syntax, in this proposal, is not ‘generative’ in any traditional sense (except the ‘fully explicit theory’ one): rather, it partitions (technically, ‘parametrizes’) a topological space. Syntactic dependencies are defined as interferences between perturbations over a field, which reduce the total entropy of the system per cycle, at the cost of introducing further dimensions where attractors corresponding to interpretations for a phrase marker can be found. Part IV is a sample of what we can gain by further pursuing the physics of language approach, both in terms of empirical adequacy and theoretical elegance, not to mention the unlimited possibilities of interdisciplinary collaboration.
In this section we set our focus on island phenomena as defined by Ross (1967), critically revisiting the most relevant literature on this topic, and establishing a typology of constructions that are strong islands, which cannot be violated. These constructions are particularly interesting because they limit the phase space of what is expressible via natural language, and thus reveal crucial aspects of its underlying dynamics. We will argue that a dynamically frustrated system which is characterized by displaying mixed computational dependencies can provide straightforward characterizations of cyclicity in terms of changes in dependencies in local domains.

    Logics of formal inconsistency

    Get PDF
    Advisors: Walter Alexandre Carnielli, Carlos M. C. L. Caleiro. Text in English and Portuguese. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Filosofia e Ciencias Humanas, and Universidade Tecnica de Lisboa, Instituto Superior Tecnico.
    Abstract: According to the classical consistency presupposition, contradictions have an explosive character: whenever they are present in a theory, anything goes, and no sensible reasoning can thus take place. A logic is paraconsistent if it disallows such a presupposition, and allows instead for some inconsistent yet non-trivial theories to make perfect sense. The Logics of Formal Inconsistency, LFIs, form a particularly expressive class of paraconsistent logics in which the metatheoretical notion of consistency can be internalized at the object-language level. As a consequence, the LFIs are able to recapture consistent reasoning by the addition of appropriate consistency assumptions. So, for instance, while classical rules such as disjunctive syllogism (from A and (not-A)-or-B, infer B) are bound to fail in a paraconsistent logic (because A and (not-A) could both be true for some A, independently of B), they can be recovered by an LFI if the set of premises is enlarged by the presumption that we are reasoning in a consistent environment (in this case, by the addition of (consistent-A) as an extra hypothesis of the rule).
The present monograph introduces the LFIs and provides several illustrations of them and of their properties, showing that such logics constitute in fact the majority of interesting paraconsistent systems from the literature. Several ways of performing the recapture of consistent reasoning inside such inconsistent systems are also illustrated. In each case, interpretations in terms of many-valued, possible-translations, or modal semantics are provided, and the problems related to providing algebraic counterparts to such logics are surveyed. A formal abstract approach is proposed for all related definitions and an extended investigation is carried out into the logical principles and the positive and negative properties of negation.
    Degree: Doctor in Philosophy and Mathematics.
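    The recapture of disjunctive syllogism described above can be made concrete by a brute-force check over a three-valued truth table. The sketch below uses a matrix in the style of LFI1/J3 (truth values 1, 0.5 and 0, with 1 and 0.5 designated); it illustrates the general LFI mechanism under these assumptions and is not necessarily one of the exact systems studied in the monograph.

        # Minimal sketch: disjunctive syllogism fails paraconsistently but is
        # recovered once a consistency assumption (consistent-A) is added.
        # The three-valued matrix is an illustrative assumption in the style of
        # LFI1/J3, not necessarily a system from the monograph.
        from itertools import product

        VALUES = (1.0, 0.5, 0.0)         # true, both, false
        DESIGNATED = {1.0, 0.5}          # values that count as "holding"

        def neg(a):                      # paraconsistent negation
            return {1.0: 0.0, 0.5: 0.5, 0.0: 1.0}[a]

        def disj(a, b):                  # disjunction as maximum
            return max(a, b)

        def consistent(a):               # the consistency ("circle") operator
            return 1.0 if a != 0.5 else 0.0

        def entails(premises, conclusion):
            """Matrix entailment: every valuation designating all premises
            must also designate the conclusion."""
            for p, q in product(VALUES, repeat=2):
                if all(prem(p, q) in DESIGNATED for prem in premises):
                    if conclusion(p, q) not in DESIGNATED:
                        return False
            return True

        A         = lambda p, q: p                  # A
        notA_or_B = lambda p, q: disj(neg(p), q)    # (not-A)-or-B
        cons_A    = lambda p, q: consistent(p)      # consistent-A
        B         = lambda p, q: q                  # B

        print(entails([A, notA_or_B], B))           # False: syllogism fails
        print(entails([A, cons_A, notA_or_B], B))   # True: recovered with consistent-A

    The first check fails because a valuation may make A both true and false (value 0.5), designating A and (not-A)-or-B while leaving B undesignated; adding consistent-A excludes exactly those valuations.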