
    Metalevel and reflexive extension in mechanical theorem proving

    In spite of many years of research into mechanical assistance for mathematics, it is still much more difficult to construct a proof on a machine than on paper. Of course this is partly because, unlike a proof on paper, a machine-checked proof must be formal in the strictest sense of that word, but it is also because the ways of going about building proofs on a machine are usually limited compared to what a mathematician is used to. This thesis looks at some possible extensions to the range of tools available on a machine that might lend a user more flexibility in proving theorems, complementing whatever is already available.

    In particular, it examines what is possible in a framework theorem prover. Such a system, if it is configured to prove theorems in a particular logic T, must have a formal description of the proof theory of T written in the framework theory F of the system. So it should be possible to use whatever facilities are available in F not only to prove theorems of T, but also theorems about T that can then be used in their turn to aid the user in building theorems of T.

    The thesis is divided into three parts. The first describes the theory FS₀, which has been suggested by Feferman as a candidate for a framework theory suitable for doing meta-theory. The second describes some experiments with FS₀, proving meta-theorems. The third describes an experiment in extending the theory PRA, declared in FS₀, with a reflection facility.

    More precisely, in the second part three theories are formalised: propositional logic, sorted predicate logic, and the lambda calculus (with a de Bruijn style binding). For the first two, the deduction theorem and the prenex normal form theorem are respectively proven. For the third, a relational definition of beta-reduction is replaced with an explicit function.

    In the third part, a method is proposed for avoiding the work involved in building a full Gödel-style proof predicate for a theory. It is suggested that the language be extended with quotation and substitution facilities directly, instead of providing them as definitional extensions. With this, it is possible to exploit an observation of Solovay's that the Löb derivability conditions are sufficient to capture the schematic behaviour of a proof predicate. Combining this with a reflection schema is enough to produce a non-conservative extension of PRA, and this is demonstrated by some experiments.
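
    The third formalisation above turns on replacing a relational definition of beta-reduction with an explicit function over de Bruijn-indexed terms. As a minimal, generic sketch of what such a function can look like (written here in Haskell; the names and setup are illustrative and not taken from the thesis):

```haskell
-- De Bruijn-indexed lambda terms: variables are natural-number indices.
data Term = Var Int | App Term Term | Lam Term
  deriving (Show, Eq)

-- shift d c t: add d to every variable of t whose index is >= the cutoff c.
shift :: Int -> Int -> Term -> Term
shift d c (Var k)   = Var (if k >= c then k + d else k)
shift d c (App f a) = App (shift d c f) (shift d c a)
shift d c (Lam b)   = Lam (shift d (c + 1) b)

-- subst j s t: substitute s for the variable with index j in t.
subst :: Int -> Term -> Term -> Term
subst j s (Var k)   = if k == j then s else Var k
subst j s (App f a) = App (subst j s f) (subst j s a)
subst j s (Lam b)   = Lam (subst (j + 1) (shift 1 0 s) b)

-- Beta-reduction as an explicit (partial) function: one step at the root.
betaStep :: Term -> Maybe Term
betaStep (App (Lam b) a) = Just (shift (-1) 0 (subst 0 (shift 1 0 a) b))
betaStep _               = Nothing
```

    The point of the relational-to-functional move is that a function like betaStep computes the reduct directly, instead of merely characterising when one term reduces to another.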

    Compiling Unit Clauses for the Warren Abstract Machine

    This thesis describes the design, development, and installation of a computer program which compiles unit clauses generated in a Prolog-based environment at Argonne National Laboratory into Warren Abstract Machine (WAM) code. The program enhances the capabilities of the environment by providing rapid unification and subsumption tests for the very significant class of unit clauses. This should improve performance substantially for large programs that generate and use many unit clauses.
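
    The subsumption test for unit clauses that such a compiler speeds up amounts to one-sided matching of one clause's arguments against another's. Purely as a generic sketch of that operation (not code from the thesis or from the Argonne environment):

```haskell
import Control.Monad (foldM)
import qualified Data.Map as M

-- First-order terms: variables and structures (constants are 0-ary structures).
data Term = Var String | Struct String [Term]
  deriving (Show, Eq)

type Subst = M.Map String Term

-- match pattern candidate: one-sided matching, binding variables only in the
-- pattern. A unit clause subsumes a ground unit clause iff matching succeeds.
match :: Term -> Term -> Subst -> Maybe Subst
match (Var v) t s = case M.lookup v s of
  Nothing -> Just (M.insert v t s)
  Just t' -> if t' == t then Just s else Nothing
match (Struct f as) (Struct g bs) s
  | f == g && length as == length bs = foldM step s (zip as bs)
  where step acc (a, b) = match a b acc
match _ _ _ = Nothing
```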

    A synthetic axiomatization of Map Theory

    This paper presents a substantially simplified axiomatization of Map Theory and proves the consistency of this axiomatization in ZFC under the assumption that there exists an inaccessible ordinal. Map Theory axiomatizes lambda calculus plus Hilbert's epsilon operator. All theorems of ZFC set theory, including the axiom of foundation, are provable in Map Theory, and if one omits Hilbert's epsilon operator from Map Theory then one is left with a computer programming language. Map Theory fulfills Church's original aim of introducing lambda calculus. Map Theory is suited for reasoning about classical mathematics as well as computer programs. Furthermore, Map Theory is suited for eliminating the barrier between classical mathematics and computer science rather than just supporting the two fields side by side. Map Theory axiomatizes a universe of "maps", some of which are "wellfounded". The class of wellfounded maps in Map Theory corresponds to the universe of sets in ZFC. The first version MT0 of Map Theory had axioms which populated the class of wellfounded maps, much like the power set axiom et al. populate the universe of ZFC. The new axiomatization MT of Map Theory is "synthetic" in the sense that the class of wellfounded maps is defined inside Map Theory rather than being introduced through axioms. In the paper we define the notion of kappa- and kappasigma-expansions and prove that if sigma is the smallest strongly inaccessible cardinal then canonical kappasigma-expansions are models of MT (which proves the consistency). Furthermore, in the appendix, we prove that canonical omega-expansions are fully abstract models of the computational part of Map Theory.
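
    For readers unfamiliar with Hilbert's epsilon operator, its standard characterisation is the following (illustrative only; the actual MT axioms are stated differently):

```latex
% \varepsilon x.\,\varphi(x) denotes some witness of \varphi whenever one exists:
\exists x\,\varphi(x) \;\rightarrow\; \varphi\bigl(\varepsilon x.\,\varphi(x)\bigr),
\qquad\text{and hence}\qquad
\exists x\,\varphi(x) \;\leftrightarrow\; \varphi\bigl(\varepsilon x.\,\varphi(x)\bigr).
```

    The epsilon operator is the non-computational ingredient here; omitting it, as the abstract notes, leaves the purely computational lambda-calculus part.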

    The cognitive development of mathematics - Spatial and timing measures associated with algorithms

    The thesis considers in turn measures of algorithms, measures of programs, and measures of computations. We define an algorithm's measure as the average of the space-time requirements of its associated computations, and here, as with measures for programs, a question of optimisation arises: that of finding the algorithm for a function which has the least such measure. The problem of optimisation for algorithmic measure, in its general form, proves to be unsolvable, but we show that an effective optimisation procedure does exist with regard to algorithms for finite functions, and give in detail the solution of the special case for functions with a domain of two elements. Further, we reduce the determination of the optimum algorithm for infinite functions to that of calculating the value of a primitive recursive function for any sufficiently large t. Taking a different viewpoint, we investigate the existence of lower bounds to the measures of algorithms for certain functions. A similar analysis is applied to the spatial measure defined for programs, program length, and we discuss some of the philosophical ramifications of program brevity. In addition, a pseudo-spatial measure, the number of instructions a program contains, is considered. By using reduction theorems which map onto one another corresponding instructions in equivalent programs, we are able to adapt our results on program length to this pseudo-spatial measure. We then examine a problem of secondary optimisation, which involves a minimisation of both algorithmic and program measure. Measures of computations have been analysed by Myhill, Trakhtenbrot, Smullyan, Ritchie, Cleave, Rabin, Arbib and Blum. An essential element in Blum's definition of computational measure is the 'measuring predicates', and we investigate certain relations between their spatial and timing requirements (as where an upper bound on one such quantity places a lower bound on another). The relevant literature is discussed in brief, and we take up a number of points which arise. Most of the arguments and results in the thesis are formulated in terms of Turing machines, but they are applicable to other means of representing algorithms. In the final chapter we investigate how the definition of Turing machines may be extended so as to provide a more authentic model of actual computers, both in regard to space-time measure and to the domain of functions they encompass.
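
    For reference, the 'measuring predicates' in Blum's definition rest on his two axioms for a computational complexity measure, which are standardly stated as follows:

```latex
% Blum's axioms for a complexity measure \Phi relative to an acceptable
% numbering \varphi of the partial computable functions:
\begin{align*}
\text{(B1)}\quad & \Phi_i(x) \text{ is defined} \iff \varphi_i(x) \text{ is defined;}\\
\text{(B2)}\quad & \text{the predicate } \Phi_i(x) = y \text{ is decidable, uniformly in } i, x, y.
\end{align*}
```

    Both the time and (with the usual convention that it is undefined on non-halting computations) the space used by a Turing machine satisfy these axioms, which is what allows spatial and timing requirements to be treated within one framework.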

    The design and implementation of a relational programming system.

    The declarative class of computer languages consists mainly of two paradigms - the logic and the functional. Much research has been devoted in recent years to the integration of the two, with the aim of securing the advantages of both without retaining their disadvantages. To date this research has, arguably, been less fruitful than initially hoped. A large number of composite functional/logical languages have been proposed but have generally been marred by the lack of a firm, cohesive, mathematical basis. More recently new declarative paradigms, equational and constraint languages, have been advocated. These however do not fully encompass those features we perceive as being central to functional and logic languages. The crucial functional features are higher-order definitions, static polymorphic typing, applicative expressions and laziness. The crucial logic features are the ability to reason about both functional and non-functional relationships and to handle computations involving search. This thesis advocates a new declarative paradigm which lies midway between functional and logic languages - the so-called relational paradigm. In a relational language, program and data alike are denoted by relations. All expressions are relations constructed from simpler expressions using operators which form a relational algebra. The impetus for the use of relations in a declarative language comes from observations concerning their connection to functional and logic programming. Relations are mathematically more general than functions, modelling non-functional as well as functional relationships. They also form the basis of many logic languages, for example, Prolog. This thesis proposes a new relational language based entirely on binary relations, named Drusilla. We demonstrate the functional and logic aspects of Drusilla. It retains the higher-order objects and polymorphism found in modern functional languages but handles non-determinism and models relationships between objects in the manner of a logic language, with the notion of algorithm being composed of logic and control elements. Different programming styles - functional, logic and relational - are illustrated. However, such expressive power does not come for free; it has associated with it a high cost of implementation. Two main techniques are used in the necessarily complex language interpreter. A type inference system checks programs to ensure they are meaningful and simultaneously performs automatic representation selection for relations. A symbolic manipulation system transforms programs to improve the efficiency of expressions and to increase the number of possible representations for relations while preserving program meaning.
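
    As a rough illustration of the relational style (generic, and not Drusilla's actual syntax or representations), binary relations can be treated as sets of pairs equipped with relational-algebra operators:

```haskell
import qualified Data.Set as S

-- A binary relation as a set of pairs.
type Rel a b = S.Set (a, b)

-- Relational composition: x (r ; s) z iff there is a y with x r y and y s z.
compose :: (Ord a, Ord b, Ord c) => Rel a b -> Rel b c -> Rel a c
compose r s = S.fromList [ (x, z) | (x, y)  <- S.toList r
                                  , (y', z) <- S.toList s
                                  , y == y' ]

-- Converse and union complete a small relational algebra.
converse :: (Ord a, Ord b) => Rel a b -> Rel b a
converse = S.map (\(x, y) -> (y, x))

union :: (Ord a, Ord b) => Rel a b -> Rel a b -> Rel a b
union = S.union

-- A non-functional relation: "parent of".
parent :: Rel String String
parent = S.fromList [("alice", "bob"), ("alice", "carol"), ("bob", "dave")]

-- "Grandparent" is obtained purely by composing parent with itself.
grandparent :: Rel String String
grandparent = compose parent parent
```

    Here parent is non-functional (one person relates to several children), and grandparent is defined by relational composition rather than by a function definition or an explicit search procedure.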

    Investigations in Belnap's Logic of Inconsistent and Unknown Information

    In 1977 Nuel Belnap proposed a four-valued logic which, in contrast to classical logic, was intended to be able to deal with both inconsistent and missing information. This logic, however, has the drawback that it cannot express statements of the form 'if ..., then ...'. Starting from this observation, we analyse the two non-classical aspects, inconsistency and missing information, by developing a three-valued logic that can deal with inconsistent information and a modal logic that can deal with missing information. Both logics are non-monotonic. We investigate properties of these logics such as compactness, decidability, deduction theorems and computational complexity. It turns out that the three-valued logic is not compact and that its consequence set is in general not recursively enumerable. If one restricts attention to finite sets of formulas, however, the consequence set is recursively decidable, lies in the class Σ₂^P of the polynomial-time hierarchy and is DIFF^P-hard. We give a sound and complete proof procedure, based on semantic tableaux, for finite sets of premises. In addition, we investigate weakenings of the compactness property. The non-monotonic modal logic, based on S5 models, turns out to be no less complex. Here too we investigate a sensible weakening of the compactness property. Furthermore, we study the relationship to other non-monotonic modal logics such as Moore's autoepistemic logic (AEL) and McDermott's NML-2. We show that our logic lies between AEL and NML-2. Finally, we couple the modal logic we have designed with the three-valued logic. The resulting logic MKT is an extension of the non-monotonic fragment of Belnap's logic. We conclude our investigations with a comparison of MKT and various information-theoretic logics, such as Nelson's N and Heyting's intuitionistic logic.
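
    For orientation, the standard definitions of Belnap's four values and their basic connectives (generic definitions, not code from the dissertation) can be sketched as follows:

```haskell
-- Belnap's four values: N = neither (no information), B = both (contradictory
-- information), alongside classical T and F.
data Four = N | F | T | B
  deriving (Show, Eq)

-- Negation swaps T and F and leaves N and B fixed.
neg :: Four -> Four
neg T = F
neg F = T
neg v = v

-- Conjunction is the meet in the truth ordering, where F <= N <= T and
-- F <= B <= T, with N and B incomparable (so their meet is F).
conj :: Four -> Four -> Four
conj F _ = F
conj _ F = F
conj T x = x
conj x T = x
conj N N = N
conj B B = B
conj _ _ = F          -- the remaining cases are N with B and B with N

-- Disjunction by De Morgan duality.
disj :: Four -> Four -> Four
disj x y = neg (conj (neg x) (neg y))
```

    The dissertation's point of departure, as noted above, is that this logic lacks a satisfactory 'if ..., then ...' connective, which motivates the separate three-valued and modal treatments.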

    An Algorithmic Interpretation of Quantum Probability

    The Everett (or relative-state, or many-worlds) interpretation of quantum mechanics has come under fire for inadequately dealing with the Born rule (the formula for calculating quantum probabilities). Numerous attempts have been made to derive this rule from the perspective of observers within the quantum wavefunction. These are not really analytic proofs, but are rather attempts to derive the Born rule as a synthetic a priori necessity, given the nature of human observers (a fact not fully appreciated even by all of those who have attempted such proofs). I show why existing attempts are unsuccessful or only partly successful, and postulate that Solomonoff's algorithmic approach to the interpretation of probability theory could clarify the problems with these approaches. The Sleeping Beauty probability puzzle is used as a springboard from which to deduce an objectivist, yet synthetic a priori framework for quantum probabilities, that properly frames the role of self-location and self-selection (anthropic) principles in probability theory. I call this framework "algorithmic synthetic unity" (or ASU). I offer no new formal proof of the Born rule, largely because I feel that existing proofs (particularly that of Gleason) are already adequate, and as close to being a formal proof as one should expect or want. Gleason's one unjustified assumption--known as noncontextuality--is, I will argue, completely benign when considered within the algorithmic framework that I propose. I will also argue that, to the extent the Born rule can be derived within ASU, there is no reason to suppose that we could not derive all the other fundamental postulates of quantum theory as well. There is nothing special here about the Born rule, and I suggest that a completely successful Born rule proof might only be possible once all the other postulates become part of the derivation. As a start towards this end, I show how we can already derive the essential content of the fundamental postulates of quantum mechanics, at least in outline, and especially if we allow some educated and well-motivated guesswork along the way. The result is some steps towards a coherent and consistent algorithmic interpretation of quantum mechanics.
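
    For reference, the Born rule discussed above assigns probabilities to measurement outcomes as follows:

```latex
% Born rule: for a normalised state |\psi\rangle measured in an orthonormal
% basis \{|i\rangle\}, the probability of obtaining outcome i is
P(i) \;=\; \bigl|\langle i \mid \psi \rangle\bigr|^{2}.
```

    Gleason's theorem recovers this quadratic form from the assumption that the probability assigned to an outcome depends only on the projector concerned (noncontextuality), the single assumption the abstract argues is benign within the proposed algorithmic framework.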

    Belief systems for persuasive discourse planning

    This thesis is concerned with the problem of constructing the logical structure of a persuasive discourse. A persuasive discourse can be defined as a monodirectional form of communication, generated by a speaker in order to convince a hearer of the validity (or fallacy) of a specific belief. The construction of the structure of a persuasive discourse is realized, in this work, through the adoption of two basic elements: a belief system and a planning system. The planning system is used as a tool for the automatic generation of the discourse structure (or plan), obtained through the decomposition of the assigned (communicative) goals of persuasion, aimed at producing specific effects on the hearer's beliefs. The belief system is adopted in order to endow the planning process with a formal language of beliefs for the representation of such goals, and with the mechanisms which govern the propagation of their (expected) effects on the rest of the hearer's belief state. The main results presented consist of the formalization of a paradigm for the specification of belief systems, and of a method — whose correctness is formally proved — for their integration with planning systems. The formalization of a belief system for discourse structure representation (defined in accordance with the theoretical paradigm) is also given, together with the description of its implementation and integration with a specific planner, which resulted in the actual completion of a system for the automatic generation of persuasive discourse plans.
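
    Purely as an illustrative sketch (the names, types, and rule base below are hypothetical and not taken from the thesis), the interplay of a belief state with goal decomposition might be modelled along these lines:

```haskell
import qualified Data.Map as M

-- A hypothetical model: the hearer's belief state maps propositions to strengths.
type Prop = String
data Strength = Rejected | Undecided | Accepted
  deriving (Show, Eq)
type BeliefState = M.Map Prop Strength

-- A communicative goal: bring the hearer to hold a proposition with a given strength.
data Goal = Believe Prop Strength
  deriving Show

-- The hearer's current attitude towards a proposition (Undecided if unknown).
beliefOf :: BeliefState -> Prop -> Strength
beliefOf bs p = M.findWithDefault Undecided p bs

-- One planning step: a goal already satisfied needs no discourse moves; otherwise
-- it is decomposed into sub-goals for its supporting propositions, drawn from a
-- (hypothetical) rule base mapping each proposition to premises that support it.
decompose :: M.Map Prop [Prop] -> BeliefState -> Goal -> [Goal]
decompose rules bs (Believe p s)
  | beliefOf bs p == s = []
  | otherwise          = [ Believe q Accepted | q <- M.findWithDefault [] p rules ]
```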