
    Variable types for meaning assembly: a logical syntax for generic noun phrases introduced by most

    This paper proposes a way to compute the meanings associated with sentences containing generic noun phrases corresponding to the generalized quantifier most. We call these generics specimens; they resemble stereotypes or prototypes in lexical semantics. The meanings are viewed as logical formulae that can thereafter be interpreted in your favourite models. To do so, we depart significantly from the dominant Fregean view with a single untyped universe. Indeed, our proposal adopts type theory with some hints from Hilbert's ε-calculus (Hilbert, 1922; Avigad and Zach, 2008) and from medieval philosophy, see e.g. de Libera (1993, 1996). Our type-theoretic analysis bears some resemblance to ongoing work in lexical semantics (Asher, 2011; Bassac et al., 2010; Moot, Prévot and Retoré, 2011). Our model also applies to classical examples involving a class, or a generic element of this class, which is not uttered but provided by the context. An outcome of this study is that, in the minimalism-contextualism debate (see Conrad, 2011), if one adopts a type-theoretical view, terms encode the purely semantic meaning component while their typing is pragmatically determined.
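    A rough illustration of the intended contrast, in our own notation rather than the paper's: a Fregean analysis treats most as a generalized quantifier relating two predicates over a single untyped universe, whereas the specimen analysis picks out a generic element of the typed restrictor class, in the spirit of Hilbert's ε-operator, and predicates the scope of it directly:

        \text{Fregean reading: } \mathsf{most}(\lambda x.\,A(x),\; \lambda x.\,B(x))
        \text{Specimen reading: } B(g_A), \quad g_A : A \ \text{a generic element of the class } A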

    A Framework for Program Development Based on Schematic Proof

    Often, calculi for manipulating and reasoning about programs can be recast as calculi for synthesizing programs. The difference often involves only a slight shift of perspective: admitting metavariables into proofs. We propose that such calculi should be implemented in logical frameworks that support this kind of proof construction, and that such an implementation can unify program verification and synthesis. Our proposal is illustrated with a worked example developed in Paulson's Isabelle system. We also give examples of existing calculi that are closely related to the methodology we propose, and of others that can be profitably recast using our approach.
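    A minimal sketch of this shift of perspective (hypothetical types, not the paper's Isabelle development): a verification calculus checks complete proof terms, while a synthesis calculus also manipulates terms containing metavariables, instantiating them as the derivation proceeds.

        -- Proof/program terms with metavariables (holes) and refinement
        -- by substitution; capture-avoidance is ignored in this toy sketch.
        data Term
          = Var String
          | App Term Term
          | Lam String Term
          | Meta String            -- a metavariable standing for a missing subterm

        type Subst = [(String, Term)]

        -- A synthesis step refines the proof state by instantiating metavariables.
        instantiate :: Subst -> Term -> Term
        instantiate s (Meta m)  = maybe (Meta m) id (lookup m s)
        instantiate s (App f x) = App (instantiate s f) (instantiate s x)
        instantiate s (Lam v b) = Lam v (instantiate s b)
        instantiate _ t         = t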

    Confluence of CHR Revisited: Invariants and Modulo Equivalence

    Abstract simulation of one transition system by another is introduced as a means to simulate a potentially infinite class of similar transition sequences within a single transition sequence. This is useful for proving confluence under invariants of a given system, as it may reduce the number of proof cases to consider from infinitely many to finitely many. The classical confluence results for Constraint Handling Rules (CHR) can be explained in this way, using CHR as a simulation of itself. Using an abstract simulation based on a ground representation, we extend these results to include confluence under invariants and confluence modulo equivalence, which had not been done in a satisfactory way before.

    Comment: Pre-proceedings paper presented at the 28th International Symposium on Logic-Based Program Synthesis and Transformation (LOPSTR 2018), Frankfurt am Main, Germany, 4-6 September 2018 (arXiv:1808.03326).
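    For orientation, the property targeted in this abstract can be stated roughly as follows (standard notation, our paraphrase, with I the invariant and ≈ the equivalence): any two derivations from a state satisfying the invariant can be rejoined up to the equivalence.

        \forall s, s_1, s_2:\; I(s) \wedge s \rightarrow^{*} s_1 \wedge s \rightarrow^{*} s_2
        \;\Longrightarrow\; \exists t_1, t_2:\; s_1 \rightarrow^{*} t_1 \wedge s_2 \rightarrow^{*} t_2 \wedge t_1 \approx t_2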

    Mechanizing Principia Logico-Metaphysica in Functional Type Theory

    Principia Logico-Metaphysica contains a foundational logical theory for metaphysics, mathematics, and the sciences. It includes a canonical development of Abstract Object Theory (AOT), a metaphysical theory (inspired by ideas of Ernst Mally, formalized by Zalta) that distinguishes between ordinary and abstract objects. This article reports on recent work in which AOT has been successfully represented and partly automated in the proof assistant system Isabelle/HOL. Initial experiments within this framework reveal a crucial but overlooked fact: a deeply rooted and known paradox is reintroduced in AOT when the logic of complex terms is simply adjoined to AOT's specially formulated comprehension principle for relations. This result constitutes a new and important paradox, given how much expressive and analytic power is contributed by having the two kinds of complex terms in the system. Its discovery is the highlight of our joint project and provides strong evidence for a new kind of scientific practice in philosophy, namely, computational metaphysics. Our results were made technically possible by a suitable adaptation of Benzmüller's metalogical approach to universal reasoning, by semantically embedding theories in classical higher-order logic. This approach enables one to reuse state-of-the-art higher-order proof assistants, such as Isabelle/HOL, for mechanizing and experimentally exploring challenging logics and theories such as AOT. Our results also provide a fresh perspective on the question of whether relational type theory or functional type theory better serves as a foundation for logic and metaphysics.

    Comment: 14 pages, 6 figures; preprint of an article with the same title to appear in The Review of Symbolic Logic.
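    In rough outline (our simplified rendering, not the paper's exact formulation): AOT's comprehension principle for relations is deliberately restricted, and the trouble arises when a logic of unrestricted complex terms is adjoined, since λ-terms containing encoding subformulas can then feed a Clark-style paradox:

        \exists F\,\Box\forall x\,(Fx \equiv \varphi) \quad (\text{comprehension, with } \varphi \text{ suitably restricted})
        [\lambda y\ \exists F\,(yF \wedge \neg Fy)] \quad (\text{an encoding-laden complex term of the problematic kind})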

    Doctor of Philosophy

    The trusted computing base (TCB) of a computer system comprises the components that must be trusted in order to support its security policy. Research communities have identified the well-known minimal-TCB principle: the TCB of a system should be as small as possible, so that it can be thoroughly examined and verified. This dissertation is an experiment showing how small the TCB can be for an isolation service based on software fault isolation (SFI) for small multitasking embedded systems. The TCB achieved by this dissertation includes just the formal definitions of the isolation properties, the instruction semantics, the program logic, and a proof assistant, besides the hardware. There is no compiler, assembler, verifier, rewriter, or operating system in the TCB. To the best of my knowledge, this is the smallest TCB that has ever been shown to guarantee nontrivial properties of real binary programs on real hardware. This is accomplished by combining SFI techniques with high-confidence formal verification. An SFI implementation inserts dynamic checks before dangerous operations, and these checks provide the invariants needed by the formal verification to prove theorems about the isolation properties of ARM binary programs. The high-confidence assurance of the formal verification comes from two facts. First, the verification is based on an existing realistic semantics of the ARM ISA that was independently developed by Cambridge researchers. Second, the verification is conducted in a higher-order proof assistant, the HOL theorem prover, which mechanically checks every verification step by rigorous logic. In addition, the entire verification process, including both specification generation and verification, is automatic. To support proof automation, a novel program logic has been designed, and an automatic reasoning framework for verifying shallow safety properties has been developed. The program logic integrates Hoare-style reasoning and Floyd's inductive-assertion reasoning in a small set of definitions, which overcomes shortcomings of Hoare logic and facilitates proof automation. All inference rules of the logic are proven from the instruction semantics and the logic definitions. The framework leverages abstract interpretation to automatically find the function specifications required by the program logic. The results of the abstract interpretation are used to construct the function specifications automatically, and the specifications are proven without human interaction by utilizing intermediate theorems generated during the abstract interpretation. All of these work in concert to create the very small TCB.
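    A minimal sketch of the kind of dynamic check SFI inserts before a dangerous operation (constants and names are ours, purely illustrative): the address of a store is masked into the task's sandbox region, and that masking is precisely the invariant the verification can rely on.

        import Data.Bits ((.&.), (.|.))
        import Data.Word (Word32)

        -- Illustrative sandbox layout: a 64 KiB data region at an assumed base.
        sandboxBase, offsetMask :: Word32
        sandboxBase = 0x20000000
        offsetMask  = 0x0000FFFF

        -- The check inserted before every store: force the address into the
        -- sandbox, so the isolation property holds whatever the program does.
        sfiMask :: Word32 -> Word32
        sfiMask addr = sandboxBase .|. (addr .&. offsetMask)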

    Second-Order Algebraic Theories

    Fiore and Hur recently introduced a conservative extension of universal algebra and equational logic from first to second order. Second-order universal algebra and second-order equational logic respectively provide a model theory and a formal deductive system for languages with variable binding and parameterised metavariables. This work completes the foundations of the subject from the viewpoint of categorical algebra. Specifically, the paper introduces the notion of second-order algebraic theory and develops its basic theory. Two categorical equivalences are established: at the syntactic level, that of second-order equational presentations and second-order algebraic theories; at the semantic level, that of second-order algebras and second-order functorial models. Our development includes a mathematical definition of syntactic translation between second-order equational presentations. This gives the first formalisation of notions such as encodings and transforms in the context of languages with variable binding.
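    As a concrete, hypothetical rendering of what "variable binding and parameterised metavariables" means for syntax (our sketch, not the paper's categorical development), a second-order term grammar might look like this:

        -- Second-order abstract syntax: operators bind variables in their
        -- arguments, and metavariables carry a list of parameters.
        data Term
          = Var  String                    -- object-level variable
          | Meta String [Term]             -- parameterised metavariable M[t1,...,tn]
          | Op   String [([String], Term)] -- operator with variable-binding arguments

        -- Example: the beta-redex (lam x. M[x]) N as a second-order term.
        betaRedex :: Term
        betaRedex = Op "app" [ ([], Op "lam" [ (["x"], Meta "M" [Var "x"]) ])
                             , ([], Meta "N" []) ]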

    In Praise of Impredicativity: A Contribution to the Formalization of Meta-Programming

    Processing programs as data is one of the successes of functional and logic programming. Higher-order functions, as program-processing programs are called in functional programming, and meta-programs, as they are called in logic programming, are widespread declarative programming techniques. In logic programming, there is a gap between meta-programming practice and its theory: the formalizations of meta-programming do not explicitly address its impredicativity and are not fully adequate. This article aims at overcoming this unsatisfactory situation by discussing the relevance of impredicativity to meta-programming, by revisiting former formalizations of meta-programming, and by defining Reflective Predicate Logic, a conservative extension of first-order logic, which provides a simple formalization of meta-programming.
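    For readers outside logic programming, the flavour of meta-programming at issue is that of the classic "vanilla" meta-interpreter, sketched here in a toy propositional form (our illustration, not the article's Reflective Predicate Logic): the program being interpreted is itself an ordinary data value.

        -- A toy propositional meta-interpreter: clauses are data, and solve
        -- interprets goals against them; real meta-interpreters add
        -- unification and backtracking over variable bindings.
        data Goal = TrueG | Atom String | And Goal Goal

        type Program = [(String, Goal)]   -- (head, body) clauses

        solve :: Program -> Goal -> Bool
        solve _ TrueG     = True
        solve p (And g h) = solve p g && solve p h
        solve p (Atom a)  = or [ solve p b | (h, b) <- p, h == a ]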