
    Kleene chain completeness and fixedpoint properties

    Fragments and frame classes: Towards a uniform proof theory for modal fixed point logics

    This thesis studies the proof theory of modal fixed point logics. In particular, we construct proof systems for various fragments of the modal mu-calculus, interpreted over various classes of frames. With an emphasis on uniform constructions and general results, we aim to bring the relatively underdeveloped proof theory of modal fixed point logics closer to the well-established proof theory of basic modal logic. We employ two main approaches. First, we seek to generalise existing methods for basic modal logic to accommodate fragments of the modal mu-calculus. We use this approach for obtaining Hilbert-style proof systems. Second, we adapt existing proof systems for the modal mu-calculus to various classes of frames. This approach yields proof systems which are non-well-founded, or cyclic. The thesis starts with an introduction and some mathematical preliminaries. In Chapter 3 we give hypersequent calculi for modal logic with the master modality, building on work by Ori Lahav. This is followed by an Intermezzo, where we present an abstract framework for cyclic proofs, in which we give sufficient conditions for establishing the bounded proof property. In Chapter 4 we generalise existing work on Hilbert-style proof systems for PDL to the level of the continuous modal mu-calculus. Chapter 5 contains a novel cyclic proof system for the alternation-free two-way modal mu-calculus. Finally, in Chapter 6, we present a cyclic proof system for Guarded Kleene Algebra with Tests and take a first step towards using it to establish the completeness of an algebraic counterpart.
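    For orientation (standard background, not a result stated in the abstract), the least fixed point operator of the modal mu-calculus is axiomatised in Hilbert style by Kozen's unfolding axiom together with the Park induction rule:

    \[
    \vdash \varphi[\mu x.\varphi/x] \to \mu x.\varphi
    \qquad\text{and}\qquad
    \frac{\vdash \varphi[\psi/x] \to \psi}{\vdash \mu x.\varphi \to \psi}
    \]

    Hilbert-style systems for fragments such as PDL or the continuous mu-calculus refine this schema, whereas cyclic and non-well-founded systems replace explicit induction by a global soundness condition on infinite or repeating branches.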

    An Algebraic Framework for Compositional Program Analysis

    The purpose of a program analysis is to compute an abstract meaning for a program which approximates its dynamic behaviour. A compositional program analysis accomplishes this task with a divide-and-conquer strategy: the meaning of a program is computed by dividing it into sub-programs, computing their meaning, and then combining the results. Compositional program analyses are desirable because they can yield scalable (and easily parallelizable) program analyses. This paper presents an algebraic framework for designing, implementing, and proving the correctness of compositional program analyses. A program analysis in our framework is defined by an algebraic structure equipped with sequencing, choice, and iteration operations. From the analysis design perspective, a particularly interesting consequence of this is that the meaning of a loop is computed by applying the iteration operator to the loop body. This style of compositional loop analysis can yield interesting ways of computing loop invariants that cannot be defined iteratively. We identify a class of algorithms, the so-called path-expression algorithms [Tarjan 1981, Scholz 2007], which can be used to efficiently implement analyses in our framework. Lastly, we develop a theory for proving the correctness of an analysis by establishing an approximation relationship between an algebra defining a concrete semantics and an algebra defining an analysis.
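    As a rough, self-contained sketch of this compositional style (not the paper's framework; the parity domain, the transfer functions, and the example loop below are assumptions made for illustration), an analysis can be packaged as sequencing, choice, and iteration operators acting on abstract transformers:

```python
# Minimal sketch of a compositional analysis algebra: program fragments are
# summarised by transfer functions over a tiny parity lattice, and composite
# programs are analysed by combining summaries with seq / choice / iterate.
BOT, EVEN, ODD, TOP = "bot", "even", "odd", "top"

def join(a, b):
    if a == b or b == BOT:
        return a
    if a == BOT:
        return b
    return TOP

def seq(f, g):       # sequential composition:  f ; g
    return lambda a: g(f(a))

def choice(f, g):    # nondeterministic branch: if * then f else g
    return lambda a: join(f(a), g(a))

def iterate(f):      # loop summary: least fixed point of  x = a join f(x)
    def star(a):
        acc = a
        while True:
            nxt = join(acc, f(acc))
            if nxt == acc:       # the lattice is finite, so this terminates
                return acc
            acc = nxt
    return star

# Hypothetical fragments: "x := x + 1" flips parity, "x := 2 * x" forces even.
inc = lambda a: {BOT: BOT, EVEN: ODD, ODD: EVEN, TOP: TOP}[a]
dbl = lambda a: {BOT: BOT, EVEN: EVEN, ODD: EVEN, TOP: TOP}[a]

loop = iterate(seq(inc, dbl))    # while * do { x := x + 1; x := 2 * x }
print(loop(ODD))                 # -> 'top': odd before the loop, even after any iteration
```

    The point mirrored here is that the loop is analysed by applying the iteration operator to the summary of its body, rather than by iterating an analysis over an unrolled control-flow graph.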

    Groundedness - Its Logic and Metaphysics

    In philosophical logic, a certain family of model constructions has received particular attention. Prominent examples are the cumulative hierarchy of well-founded sets, and Kripke's least fixed point models of grounded truth. I develop a general formal theory of groundedness and explain how the well-founded sets, Cantor's extended number-sequence and Kripke's concepts of semantic groundedness are all instances of the general concept, and how the general framework illuminates these cases. Then, I develop a new approach to a grounded theory of proper classes. However, the general concept of groundedness does not account for the philosophical significance of its paradigm instances. Instead, I argue, the philosophical content of the cumulative hierarchy of sets is best understood in terms of a primitive notion of ontological priority. Then, I develop an analogous account of Kripke's models. I show that they exemplify the in-virtue-of relation much discussed in contemporary metaphysics, and thus are philosophically significant. I defend my proposal against a challenge from Kripke's “ghost of the hierarchy”
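    As background (a standard gloss rather than a claim from the text), Kripke's least fixed point model of grounded truth arises by iterating a monotone jump operator J on partial interpretations of the truth predicate, starting from the empty one:

    \[
    (E_0, A_0) = (\emptyset, \emptyset), \qquad
    (E_{\alpha+1}, A_{\alpha+1}) = J(E_\alpha, A_\alpha), \qquad
    (E_\lambda, A_\lambda) = \Big(\bigcup_{\alpha<\lambda} E_\alpha,\ \bigcup_{\alpha<\lambda} A_\alpha\Big),
    \]

    where J places in the extension (respectively anti-extension) every sentence that evaluates to true (respectively false) under, say, the Strong Kleene scheme; the grounded sentences are exactly those that enter the least fixed point of this process.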

    Complete Axiomatization for the Bisimilarity Distance on Markov Chains

    In this paper we propose a complete axiomatization of the bisimilarity distance of Desharnais et al. for the class of finite labelled Markov chains. Our axiomatization is given in the style of a quantitative extension of equational logic recently proposed by Mardare, Panangaden, and Plotkin (LICS'16) that uses equality relations t =_e s indexed by rationals, expressing that "t is approximately equal to s up to an error e". Notably, our quantitative deductive system extends in a natural way the equational system for probabilistic bisimilarity given by Stark and Smolka by introducing an axiom for dealing with the Kantorovich distance between probability distributions.
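    To give the flavour of such a quantitative deductive system (the rules below are representative of the general approach, not quoted from the paper), judgements of the form t =_e s are closed under rules such as

    \[
    \vdash t =_0 t, \qquad
    \{t =_e s\} \vdash s =_e t, \qquad
    \{t =_e s,\ s =_{e'} u\} \vdash t =_{e+e'} u, \qquad
    \{t =_e s\} \vdash t =_{e'} s \ \ (e' \geq e),
    \]

    read as reflexivity, symmetry, a triangle inequality, and weakening of the error bound; on top of such a basis, the axiomatization adds axioms tailored to probabilistic choice and the Kantorovich distance.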

    Make flows small again: revisiting the flow framework

    We present a new flow framework for separation logic reasoning about programs that manipulate general graphs. The framework overcomes problems in earlier developments: it is based on standard fixed point theory, guarantees least flows, rules out vanishing flows, and has an easy-to-understand notion of footprint, as needed for soundness of the frame rule. In addition, we present algorithms for automating the frame rule, which we evaluate on graph updates extracted from linearizability proofs for concurrent data structures. The evaluation demonstrates that our algorithms help to automate key aspects of these proofs that have previously relied on user guidance or heuristics.
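    As a toy illustration of the underlying fixed point view (the graph, inflows, and edge functions below are made up for the example and are not the paper's algorithms), a flow assigns to each node the least solution of flow(n) = inflow(n) + sum over predecessors m of edge(m, n)(flow(m)), which can be computed by Kleene iteration from the zero flow:

```python
# Toy least-flow computation: the flow of each node is the least solution of
#   flow[n] = inflow[n] + sum(edge[(m, n)](flow[m]) for predecessors m),
# obtained by Kleene iteration starting from the zero flow.
nodes = ["a", "b", "c"]
inflow = {"a": 1, "b": 0, "c": 0}
edges = {("a", "b"): lambda x: x,      # each edge simply forwards its source's flow
         ("a", "c"): lambda x: x,
         ("b", "c"): lambda x: x}

flow = {n: 0 for n in nodes}
while True:
    new = {n: inflow[n] + sum(f(flow[m]) for (m, k), f in edges.items() if k == n)
           for n in nodes}
    if new == flow:
        break
    flow = new

print(flow)   # {'a': 1, 'b': 1, 'c': 2}
```

    On this acyclic example the iteration stabilises after a few rounds, and the least flow simply counts how many paths carry the unit inflow at node a to each node.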

    A discrete geometric model of concurrent program execution

    A trace of the execution of a concurrent object-oriented program can be displayed in two dimensions as a diagram of a non-metric finite geometry. The actions of a program are represented by points, its objects and threads by vertical lines, its transactions by horizontal lines, its communications and resource sharing by sloping arrows, and its partial traces by rectangular figures. We prove informally that the geometry satisfies the laws of Concurrent Kleene Algebra (CKA); these describe and justify the interleaved implementation of multithreaded programs on computer systems with a smaller number of concurrent processors. More familiar forms of semantics (e.g., verification-oriented and operational) can be derived from CKA. Programs are represented as sets of all their possible traces of execution, and non-determinism is introduced as union of these sets. The geometry is extended to multiple levels of abstraction and granularity; a method call at a higher level can be modelled by a specification of the method body, which is implemented at a lower level. The final section describes how the axioms and definitions of the geometry have been encoded in the interactive proof tool Isabelle, and reports on progress towards automatic checking of the proofs in the paper.
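    For reference (standard CKA background rather than a statement from the paper), the law connecting sequential composition ; with concurrent composition || is the exchange law

    \[
    (a \parallel b)\,;\,(c \parallel d) \;\leq\; (a\,;\,c) \parallel (b\,;\,d),
    \]

    from which one derives, for instance, a ; b <= a || b: every interleaved, sequential execution is a permitted implementation of the concurrent composition, which is the justification of interleaving mentioned above.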

    Shoggoth: A Formal Foundation for Strategic Rewriting

    Rewriting is a versatile and powerful technique used in many domains. Strategic rewriting allows programmers to control the application of rewrite rules by composing individual rewrite rules into complex rewrite strategies. These strategies are semantically complex, as they may be nondeterministic, they may raise errors that trigger backtracking, and they may not terminate. Given such semantic complexity, it is necessary to establish a formal understanding of rewrite strategies and to enable reasoning about them in order to answer questions like: How do we know that a rewrite strategy terminates? How do we know that a rewrite strategy does not fail because we compose two incompatible rewrites? How do we know that a desired property holds after applying a rewrite strategy? In this paper, we introduce Shoggoth: a formal foundation for understanding, analysing and reasoning about strategic rewriting that is capable of answering these questions. We provide a denotational semantics of System S, a core language for strategic rewriting, and prove its equivalence to our big-step operational semantics, which extends existing work by explicitly accounting for divergence. We further define a location-based weakest precondition calculus to enable formal reasoning about rewriting strategies, and we prove this calculus sound with respect to the denotational semantics. We show how this calculus can be used in practice to reason about properties of rewriting strategies, including that they terminate, that they are well-composed, and that desired postconditions hold. The semantics and calculus are formalised in Isabelle/HOL and all proofs are mechanised.
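    To make the flavour of strategic rewriting concrete (a minimal sketch in the spirit of System S, not Shoggoth's formal semantics; the term encoding and the example rule are assumptions made for illustration), strategies can be modelled as partial functions on terms, with failure triggering backtracking in a left-biased choice:

```python
# Strategies map a term to a rewritten term, or to None to signal failure.
def fail(t):
    return None

def ident(t):
    return t

def seq(s1, s2):            # s1 ; s2 : fail if either strategy fails
    def run(t):
        r = s1(t)
        return None if r is None else s2(r)
    return run

def choice(s1, s2):         # s1 <+ s2 : backtrack to s2 if s1 fails
    return lambda t: s1(t) if s1(t) is not None else s2(t)

def attempt(s):             # "try": apply s, fall back to the identity on failure
    return choice(s, ident)

def repeat(s):              # apply s until it fails; diverges if s always succeeds
    def run(t):
        r = s(t)
        return t if r is None else run(r)
    return run

# Hypothetical rewrite rule on terms encoded as nested tuples: add(0, x) -> x.
def add_zero(t):
    if isinstance(t, tuple) and len(t) == 3 and t[0] == "add" and t[1] == 0:
        return t[2]
    return None

print(repeat(add_zero)(("add", 0, ("add", 0, 42))))   # -> 42
```

    Questions such as whether repeat(s) terminates, or whether a composed strategy can ever fail because both branches of a choice fail, are exactly the kinds of properties the paper's weakest precondition calculus is designed to establish.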