    Formal Component-Based Semantics

    One of the proposed solutions for improving the scalability of semantics of programming languages is Component-Based Semantics, introduced by Peter D. Mosses. It is expected that this framework can also be used effectively for modular meta-theoretic reasoning. This paper presents a formalization of Component-Based Semantics in the theorem prover Coq. It is based on Modular SOS, a variant of structural operational semantics (SOS), and makes essential use of dependent types, while profiting from type classes. This formalization constitutes a contribution towards modular meta-theoretic formalizations in theorem provers. As a small example, a modular proof of determinism of a mini-language is developed. Comment: In Proceedings SOS 2011, arXiv:1108.279
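
    A rough Haskell sketch of the modularity being described (the paper's formalization is in Coq; the mini-language and all names below are hypothetical): each component ships the small-step rules for the constructs it owns, and a language is assembled by composing components without editing them.

        -- Sketch of the Modular SOS idea behind Component-Based Semantics.
        -- (Hypothetical names; the paper itself works in Coq.)
        data Expr = Lit Int | Plus Expr Expr | IfZ Expr Expr Expr
          deriving Show

        -- One "component" = a partial small-step function for the constructs
        -- it owns; it receives the combined stepper for use on subterms.
        type Component = (Expr -> Maybe Expr) -> Expr -> Maybe Expr

        arith :: Component
        arith _    (Plus (Lit m) (Lit n)) = Just (Lit (m + n))
        arith step (Plus a b)
          | Just a' <- step a = Just (Plus a' b)
          | Just b' <- step b = Just (Plus a b')
        arith _ _ = Nothing

        cond :: Component
        cond _    (IfZ (Lit 0) t _) = Just t
        cond _    (IfZ (Lit _) _ e) = Just e
        cond step (IfZ c t e)       = (\c' -> IfZ c' t e) <$> step c
        cond _ _ = Nothing

        -- Assemble a language from components; first applicable rule fires.
        combine :: [Component] -> Expr -> Maybe Expr
        combine comps = go
          where
            go x = foldr (\c acc -> orElse (c go x) acc) Nothing comps
            orElse (Just r) _ = Just r
            orElse Nothing  r = r

        main :: IO ()
        main = print (combine [arith, cond]
                        (IfZ (Plus (Lit 1) (Lit (-1))) (Lit 7) (Lit 8)))
        -- Just (IfZ (Lit 0) (Lit 7) (Lit 8))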

    A general conservative extension theorem in process algebras with inequalities

    We prove a general conservative extension theorem for transition-system-based process theories with easy-to-check and reasonable conditions. The core of this result is another general theorem which gives sufficient conditions for a system of operational rules and an extension of it to ensure conservativity, that is, provable transitions from an original term in the extension are the same as in the original system. As a simple corollary of the conservative extension theorem we prove a completeness theorem. We also prove a general theorem giving sufficient conditions under which the question of ground confluence modulo some equations for a large term rewriting system associated with an equational process theory reduces to the same question for a small term rewriting system, provided the large system is a conservative extension of the small one. We provide many applications to show that our results are useful. The applications include (but are not limited to) various real and discrete time settings in ACP, ATP, and CCS, and the notions of projection, renaming, the stage operator, priority, recursion, the silent step, autonomous actions, the empty process, divergence, etc.
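
    An illustrative sketch of the conservativity property itself, in Haskell (the mini process calculus below is hypothetical and far simpler than the paper's general setting): the extended rules must give every original term exactly the transitions it already had.

        -- Conservativity, illustrated on a toy calculus (hypothetical example).
        import Data.List (sort)

        data Act = A | B deriving (Show, Eq, Ord)

        data P = Stop | Pre Act P | Alt P P   -- original syntax
               | Par P P                      -- added by the extension
          deriving (Show, Eq, Ord)

        -- Rules of the original system (Par never occurs in original terms).
        stepOld :: P -> [(Act, P)]
        stepOld Stop      = []
        stepOld (Pre a p) = [(a, p)]
        stepOld (Alt p q) = stepOld p ++ stepOld q
        stepOld (Par _ _) = []   -- unreachable for original terms

        -- Rules of the extension: the old rules plus interleaving for Par.
        stepNew :: P -> [(Act, P)]
        stepNew Stop      = []
        stepNew (Pre a p) = [(a, p)]
        stepNew (Alt p q) = stepNew p ++ stepNew q
        stepNew (Par p q) = [ (a, Par p' q) | (a, p') <- stepNew p ]
                         ++ [ (a, Par p q') | (a, q') <- stepNew q ]

        isOriginal :: P -> Bool
        isOriginal Stop      = True
        isOriginal (Pre _ p) = isOriginal p
        isOriginal (Alt p q) = isOriginal p && isOriginal q
        isOriginal (Par _ _) = False

        -- Conservativity, checked pointwise on an original term.
        conservativeOn :: P -> Bool
        conservativeOn t =
          not (isOriginal t) || sort (stepNew t) == sort (stepOld t)

        main :: IO ()
        main = print (all conservativeOn
          [Stop, Pre A Stop, Alt (Pre A Stop) (Pre B (Pre A Stop))])
        -- True: the extension is conservative on these original terms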

    A new coinductive confluence proof for infinitary lambda calculus

    We present a new and formal coinductive proof of confluence and normalisation of Böhm reduction in infinitary lambda calculus. The proof is simpler than previous proofs of this result. The technique of the proof is new, i.e., it is not merely a coinductive reformulation of any earlier proof. We formalised the proof in the Coq proof assistant. Comment: arXiv admin note: text overlap with arXiv:1501.0435
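
    Haskell's lazy datatypes behave coinductively, which gives a cheap way to write down the kind of infinitary lambda terms involved (a sketch only; the paper's development uses coinduction in Coq):

        -- Infinitary lambda terms via laziness (illustration only).
        data Term = Var String | Lam String Term | App Term Term

        -- An infinite term f (f (f ...)): a typical inhabitant of the
        -- infinitary calculus with no finite normal form.
        omegaF :: Term
        omegaF = App (Var "f") omegaF

        -- Observe a finite prefix of a possibly infinite term.
        prefix :: Int -> Term -> String
        prefix 0 _         = "..."
        prefix _ (Var x)   = x
        prefix n (Lam x t) = "\\" ++ x ++ ". " ++ prefix (n - 1) t
        prefix n (App t u) =
          "(" ++ prefix (n - 1) t ++ " " ++ prefix (n - 1) u ++ ")"

        main :: IO ()
        main = putStrLn (prefix 4 omegaF)
        -- prints the finite approximation (f (f (f (... ...))))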

    Tight Sum-of-Squares lower bounds for binary polynomial optimization problems

    We give two results concerning the power of the Sum-of-Squares (SoS)/Lasserre hierarchy. For binary polynomial optimization problems of degree 2d and an odd number of variables n, we prove that (n+2d-1)/2 levels of the SoS/Lasserre hierarchy are necessary to provide the exact optimal value. This matches the recent upper bound result by Sakaue, Takeda, Kim and Ito. Additionally, we study a conjecture by Laurent, who considered the linear representation of a set with no integral points. She showed that the Sherali-Adams hierarchy requires n levels to detect the empty integer hull, and conjectured that the SoS/Lasserre rank for the same problem is n-1. We disprove this conjecture and derive lower and upper bounds for the rank.
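
    As a quick sanity check on the lower bound, a direct instantiation of the formula above (plain arithmetic, not additional material from the paper):

        % Levels of the SoS/Lasserre hierarchy needed for degree-2d
        % binary polynomial optimization in an odd number n of variables:
        \frac{n + 2d - 1}{2}
        % e.g. quadratic objectives (d = 1) over n = 9 binary variables:
        \frac{9 + 2 \cdot 1 - 1}{2} = 5 \text{ levels.}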

    Prototyping Formal System Models with Active Objects

    We propose active object languages as a development tool for formal system models of distributed systems. In addition to a formalization based on a term rewriting system, we use established software engineering concepts, including software product lines and object orientation, which come with extensive tool support. We illustrate our modeling approach by prototyping a weak memory model. The resulting executable model is modular and, through object-oriented modeling, has clear interfaces between communicating participants. Relaxations of the basic memory model are expressed as self-contained variants of a software product line. As a modeling language we use the formal active object language ABS, which comes with an extensive tool set. This permits rapid formalization of core ideas, early validity checks in terms of formal invariant proofs, and debugging support by executing test runs. Hence, our approach supports the prototyping of formal system models with early feedback. Comment: In Proceedings ICE 2018, arXiv:1810.0205
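
    The paper's models are written in ABS; as a language-neutral sketch of the active object idea it builds on (each object owns a mailbox and a thread of control, and callers get futures for asynchronous calls), here is a minimal Haskell version with hypothetical names:

        -- Active object sketch: mailbox + own thread + futures.
        -- (Not ABS; a hypothetical illustration of the concept.)
        import Control.Concurrent (forkIO)
        import Control.Concurrent.Chan
        import Control.Concurrent.MVar

        data Msg = Write Int | Read (MVar Int)   -- methods of a toy memory cell

        newActiveCell :: IO (Chan Msg)
        newActiveCell = do
          mailbox <- newChan
          let loop val = do
                msg <- readChan mailbox
                case msg of
                  Write v  -> loop v                      -- update local state
                  Read fut -> putMVar fut val >> loop val -- resolve the future
          _ <- forkIO (loop 0)   -- the object's own thread of control
          return mailbox

        main :: IO ()
        main = do
          cell <- newActiveCell
          writeChan cell (Write 42)   -- asynchronous call, no waiting
          fut <- newEmptyMVar
          writeChan cell (Read fut)   -- ask for the value
          takeMVar fut >>= print      -- get on the future: prints 42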

    On the mechanisation of the logic of partial functions

    PhD thesis. It is well known that partial functions arise frequently in formal reasoning about programs. A partial function may not yield a value for every member of its domain. Terms that apply partial functions thus may not denote, and coping with such terms is problematic in two-valued classical logic. A question is raised: how can reasoning about logical formulae that can contain references to terms that may fail to denote (partial terms) be conducted formally? Over the years a number of approaches to coping with partial terms have been documented. Some of these approaches attempt to stay within the realm of two-valued classical logic, while others are based on non-classical logics. However, as yet there is no consensus on which approach is the best one to use. A comparison of numerous approaches to coping with partial terms is presented, based upon formal semantic definitions.

    One approach to coping with partial terms that has received attention over the years is the Logic of Partial Functions (LPF), the logic underlying the Vienna Development Method. LPF is a non-classical three-valued logic designed to cope with partial terms, where both terms and propositions may fail to denote. As opposed to using concrete undefined values, undefinedness is treated as a "gap", that is, the absence of a defined value. LPF is based upon Strong Kleene logic, where the interpretations of the logical operators are extended to cope with truth-value "gaps".

    Over the years a large body of research and engineering has gone into the development of proof-based tool support for two-valued classical logic. This has created a major obstacle to the adoption of LPF, since such proof support cannot be carried over directly, and presently there is a lack of direct proof support for LPF. An aim of this work is to investigate the applicability of mechanised (automated) proof support for reasoning in LPF about logical formulae that can contain references to partial terms. The focus of the investigation is on the basic but fundamental two-valued classical proof procedure of resolution and the associated technique of proof by contradiction. Advanced proof techniques are built on the foundation provided by these basic ones, so examining their behaviour in LPF is the essential and obvious starting point for investigating proof support for LPF. The work highlights the issues that arise when applying these techniques in LPF, and investigates the extent of the modifications needed to carry them over. This work provides the essential foundation on which to facilitate research into the modification of advanced proof techniques for LPF. (Funded by EPSRC.)
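
    A minimal encoding of LPF's three truth values with the Strong Kleene connectives described above (an illustration, not the thesis's formal development); it also shows why classical proof machinery such as proof by contradiction cannot be carried over unchanged: the law of the excluded middle can itself be undefined.

        -- Strong Kleene connectives over three truth values (sketch).
        data B3 = TT | FF | Gap deriving (Show, Eq)

        andK :: B3 -> B3 -> B3
        andK FF _  = FF           -- a defined F decides conjunction
        andK _  FF = FF
        andK TT TT = TT
        andK _  _  = Gap          -- otherwise the gap propagates

        orK :: B3 -> B3 -> B3
        orK TT _  = TT            -- a defined T decides disjunction
        orK _  TT = TT
        orK FF FF = FF
        orK _  _  = Gap

        notK :: B3 -> B3
        notK TT  = FF
        notK FF  = TT
        notK Gap = Gap

        main :: IO ()
        main = mapM_ (print . (\p -> p `orK` notK p)) [TT, FF, Gap]
        -- TT, TT, Gap: "p or not p" is not an LPF tautology, so
        -- resolution and proof by contradiction need modification.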

    Reasoning about modular datatypes with Mendler induction

    In functional programming, datatypes à la carte provide a convenient modular representation of recursive datatypes, based on their initial algebra semantics. Unfortunately, it is highly challenging to implement this technique in proof assistants that are based on type theory, like Coq. The reason is that it involves type definitions, such as those of type-level fixpoint operators, that are not strictly positive. The known work-around of impredicative encodings is problematic, insofar as it impedes conventional inductive reasoning. Weak induction principles can be used instead, but they considerably complicate proofs. This paper proposes a novel and simpler technique to reason inductively about impredicative encodings, based on Mendler-style induction. This technique involves dispensing with dependent induction, ensuring that datatypes can be lifted to predicates, and relying on relational formulations. A case study on proving subject reduction for structural operational semantics illustrates that the approach enables modular proofs, and that these proofs are essentially similar to conventional ones. Comment: In Proceedings FICS 2015, arXiv:1509.0282
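
    A Haskell sketch of the two ingredients (the paper's contribution is carrying this style into Coq induction proofs, which the sketch does not attempt): datatypes à la carte assembles a recursive datatype from functor coproducts, and a Mendler-style fold recurses only through an abstract recursive call rather than by pattern matching on the fixpoint.

        {-# LANGUAGE RankNTypes, TypeOperators #-}

        newtype Fix f = In (f (Fix f))

        data (f :+: g) e = Inl (f e) | Inr (g e)   -- functor coproduct

        data Lit e = Lit Int     -- one component
        data Add e = Add e e     -- another component

        type Expr = Fix (Lit :+: Add)   -- the datatype, assembled a la carte

        -- Mendler-style algebra: the abstract 'rec' is the only way to
        -- recurse, so no Functor instance is needed at all.
        type MAlg f a = forall r. (r -> a) -> f r -> a

        mcata :: MAlg f a -> Fix f -> a
        mcata alg (In t) = alg (mcata alg) t

        evalAlg :: MAlg (Lit :+: Add) Int
        evalAlg _   (Inl (Lit n))   = n
        evalAlg rec (Inr (Add x y)) = rec x + rec y

        main :: IO ()
        main = print (mcata evalAlg expr)   -- prints 3
          where
            expr :: Expr
            expr = In (Inr (Add (In (Inl (Lit 1)))
                                (In (Inl (Lit 2)))))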

    Techniques for Modelling Structured Operational and Denotational Semantics Definitions with Term Rewriting Systems

    A fundamental requirement for the application of automatic proof support for program verification is that the semantics of programs be appropriately formalized using the object language underlying the proof tool. This means that the semantics definition must not only be stated as syntactically correct input for the proof tool to be used, but also in such a way that the desired proofs can be performed without too many artificial complications. And it must be clear, of course, that the translation from the mathematical metalanguage into the object language is correct. The objective of this work is to present methods for the formalization of structured operational and denotational semantics definitions that meet these requirements. It combines techniques known from implementations of the λ-calculus with a new way to control term rewriting at the object level, thus reaching a conceptually simple representation based on unconditional rewriting. This deduction formalism is available within many existing proof tools, and therefore application of the representation methods is not restricted to a particular tool. Correctness of the representations is achieved by proving that the non-trivial formalizations yield results that are equivalent to the meta-level definitions in a strong sense. Since the representation algorithms have been implemented in the form of executable programs, there is no need to carry out tedious coding schemes by hand. Semantics definitions can be stated in a format very close to the usual metalanguage format, and they can be transformed automatically into an object-level representation that is accessible to proof tools. The formalizations of the two semantics definition styles are designed in a consistent way, both making use of the same modelling of the underlying mathematical basis. Therefore, they can be used simultaneously in proofs. This is demonstrated in a larger example, where an operational and a denotational semantics definition for a programming language are proved to be equivalent using the Larch Prover. This proof had previously been carried out by hand, so the characteristics of the automated proof can be made quite clear.
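
    To convey the flavour of a representation based on unconditional rewriting, here is a toy first-order rewriting engine in Haskell (hypothetical and far simpler than the representation methods described; patterns are assumed linear, i.e. each variable occurs at most once in a left-hand side):

        -- Toy unconditional term rewriting (illustration only).
        import Data.Maybe (listToMaybe)

        data Term = Var String | App String [Term] deriving (Show, Eq)
        type Rule  = (Term, Term)        -- lhs -> rhs, unconditional
        type Subst = [(String, Term)]

        -- Naive matching for linear patterns.
        match :: Term -> Term -> Maybe Subst
        match (Var x) t = Just [(x, t)]
        match (App f ts) (App g us)
          | f == g && length ts == length us =
              concat <$> sequence (zipWith match ts us)
        match _ _ = Nothing

        apply :: Subst -> Term -> Term
        apply s (Var x)    = maybe (Var x) id (lookup x s)
        apply s (App f ts) = App f (map (apply s) ts)

        -- One outermost rewrite step with the first rule that matches.
        step :: [Rule] -> Term -> Maybe Term
        step rules t = orElse root inside
          where
            root = listToMaybe
              [ apply s r | (l, r) <- rules, Just s <- [match l t] ]
            inside = case t of
              App f ts -> App f <$> stepArgs ts
              _        -> Nothing
            stepArgs []     = Nothing
            stepArgs (u:us) = case step rules u of
              Just u' -> Just (u' : us)
              Nothing -> (u :) <$> stepArgs us
            orElse (Just x) _ = Just x
            orElse Nothing  y = y

        normalize :: [Rule] -> Term -> Term
        normalize rules t = maybe t (normalize rules) (step rules t)

        -- A two-rule system: addition on Peano numerals.
        plusRules :: [Rule]
        plusRules =
          [ (App "plus" [App "z" [], Var "n"], Var "n")
          , (App "plus" [App "s" [Var "m"], Var "n"],
             App "s" [App "plus" [Var "m", Var "n"]]) ]

        main :: IO ()
        main = print (normalize plusRules
          (App "plus" [App "s" [App "s" [App "z" []]], App "s" [App "z" []]]))
        -- App "s" [App "s" [App "s" [App "z" []]]], i.e. 2 + 1 = 3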

    More SPASS with Isabelle: superposition with hard sorts and configurable simplification

    Sledgehammer for Isabelle/HOL integrates automatic theorem provers to discharge interactive proof obligations. This paper considers a tighter integration of the superposition prover SPASS to increase Sledgehammer’s success rate. The main enhancements are native support for hard sorts (simple types) in SPASS, simplification that honors the orientation of Isabelle simp rules, and a pair of clause-selection strategies targeted at large lemma libraries. The usefulness of this integration is confirmed by an evaluation on a vast benchmark suite and by a case study featuring a formalization of language-based security