
    Knowledge Compilation of Logic Programs Using Approximation Fixpoint Theory

    To appear in Theory and Practice of Logic Programming (TPLP), Proceedings of ICLP 2015. Recent advances in knowledge compilation introduced techniques to compile \emph{positive} logic programs into propositional logic, essentially exploiting the constructive nature of the least fixpoint computation. This approach has several advantages over existing approaches: it maintains logical equivalence, does not require (expensive) loop-breaking preprocessing or the introduction of auxiliary variables, and significantly outperforms existing algorithms. Unfortunately, this technique is limited to \emph{negation-free} programs. In this paper, we show how to extend it to general logic programs under the well-founded semantics. We develop our work in approximation fixpoint theory, an algebraic framework that unifies the semantics of different logics. As such, our algebraic results are also applicable to autoepistemic logic, default logic and abstract dialectical frameworks.
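
    The least-fixpoint computation this compilation technique exploits can be illustrated in a few lines. Below is a minimal sketch, not code from the paper: it iterates the immediate-consequence operator of a positive (negation-free) propositional program to its least fixpoint, with rule and atom names invented for illustration.

    ```python
    # Iterate the immediate-consequence operator T_P of a positive
    # propositional logic program until the least fixpoint is reached.
    def least_fixpoint(rules):
        """rules: list of (head, body) pairs, body a set of atoms (no negation)."""
        model = set()
        while True:
            # T_P: derive every head whose body is already satisfied
            new = {head for head, body in rules if body <= model}
            if new <= model:          # fixpoint reached
                return model
            model |= new

    # Example program:  p.   q :- p.   r :- q, s.
    program = [("p", set()), ("q", {"p"}), ("r", {"q", "s"})]
    print(least_fixpoint(program))    # {'p', 'q'} -- r is not derivable without s
    ```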

    Convolution, Separation and Concurrency

    A notion of convolution is presented in the context of formal power series together with lifting constructions characterising algebras of such series, which usually are quantales. A number of examples underpin the universality of these constructions, the most prominent ones being separation logics, where convolution is separating conjunction in an assertion quantale; interval logics, where convolution is the chop operation; and stream interval functions, where convolution is used for analysing the trajectories of dynamical or real-time systems. A Hoare logic is constructed in a generic fashion on the power series quantale, which applies to each of these examples. In many cases, commutative notions of convolution have natural interpretations as concurrency operations. Comment: 39 pages.
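
    As a concrete illustration, convolution can be instantiated over the monoid of strings under concatenation with the Boolean quantale as target, where it computes language concatenation; with heaps under disjoint union in place of strings, the same formula would give separating conjunction. The example below is an assumed toy instance, not code from the paper.

    ```python
    # Convolution into the Boolean quantale:
    #   (f * g)(x) = OR over all splittings x = y . z of f(y) AND g(z).
    def convolve(f, g):
        def h(x):
            # enumerate all splittings x = y + z of the string x
            return any(f(x[:i]) and g(x[i:]) for i in range(len(x) + 1))
        return h

    is_a = lambda s: s == "a"          # the language {"a"}
    is_bs = lambda s: set(s) <= {"b"}  # the language b* (includes "")

    ab = convolve(is_a, is_bs)
    print(ab("abb"), ab("ba"))         # True False
    ```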

    Sound and complete axiomatizations of coalgebraic language equivalence

    Coalgebras provide a uniform framework to study dynamical systems, including several types of automata. In this paper, we make use of the coalgebraic view on systems to investigate, in a uniform way, under which conditions calculi that are sound and complete with respect to behavioral equivalence can be extended to a coarser coalgebraic language equivalence, which arises from a generalised powerset construction that determinises coalgebras. We show that soundness and completeness are established by proving that expressions modulo axioms of a calculus form the rational fixpoint of the given type functor. Our main result is that the rational fixpoint of the functor $FT$, where $T$ is a monad describing the branching of the systems (e.g. non-determinism, weights, probability, etc.), has as a quotient the rational fixpoint of the "determinised" type functor $\bar F$, a lifting of $F$ to the category of $T$-algebras. We apply our framework to the concrete example of weighted automata, for which we present a new sound and complete calculus for weighted language equivalence. As a special case, we obtain non-deterministic automata, where we recover Rabinovich's sound and complete calculus for language equivalence. Comment: Corrected version of published journal article.
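
    A familiar instance of the generalised powerset construction is the determinisation of a non-deterministic automaton (a coalgebra for $FT$ with $T$ the powerset monad), after which behavioural equivalence coincides with language equivalence. The sketch below is an assumed illustration; state names and encoding are invented.

    ```python
    from itertools import chain

    # Determinise an NFA on-the-fly: states of the DFA are sets of NFA states.
    def determinise(delta, accepting):
        """delta: dict (state, letter) -> set of states; accepting: set of states."""
        def step(states, letter):
            return frozenset(chain.from_iterable(
                delta.get((q, letter), ()) for q in states))
        def accepts(states, word):
            for a in word:
                states = step(states, a)
            return bool(states & accepting)
        return accepts

    # NFA for words over {a, b} ending in "ab"
    delta = {("0", "a"): {"0", "1"}, ("0", "b"): {"0"}, ("1", "b"): {"2"}}
    accepts = determinise(delta, {"2"})
    print(accepts(frozenset({"0"}), "aab"), accepts(frozenset({"0"}), "aba"))  # True False
    ```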

    Regular Combinators for String Transformations

    We focus on (partial) functions that map input strings to a monoid such as the set of integers with addition and the set of output strings with concatenation. The notion of regularity for such functions has been defined using two-way finite-state transducers, (one-way) cost register automata, and MSO-definable graph transformations. In this paper, we give an algebraic and machine-independent characterization of this class analogous to the definition of regular languages by regular expressions. When the monoid is commutative, we prove that every regular function can be constructed from constant functions using the combinators of choice, split sum, and iterated sum, which are analogs of union, concatenation, and Kleene-*, respectively, but enforce unique (or unambiguous) parsing. Our main result is for the general case of non-commutative monoids, which is of particular interest for capturing regular string-to-string transformations for document processing. We prove that the following additional combinators suffice for constructing all regular functions: (1) the left-additive versions of split sum and iterated sum, which allow transformations such as string reversal; (2) sum of functions, which allows transformations such as copying of strings; and (3) function composition, or alternatively, a new concept of chained sum, which allows output values from adjacent blocks to mix. Comment: This is the full version, with omitted proofs and constructions, of the conference paper currently in submission.
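
    To make the unambiguous-parsing requirement concrete, here is an assumed toy encoding of split sum for partial functions from strings into the commutative monoid of integers under addition; the helper functions are invented for illustration and are not the paper's definitions.

    ```python
    # "Split sum" f (+) g: add f and g across the unique splitting w = u . v
    # on which both are defined; undefined (None) if the splitting is absent
    # or ambiguous -- this is the unique-parsing requirement.
    def split_sum(f, g):
        def h(w):
            vals = [f(w[:i]) + g(w[i:]) for i in range(len(w) + 1)
                    if f(w[:i]) is not None and g(w[i:]) is not None]
            return vals[0] if len(vals) == 1 else None   # unique parse required
        return h

    count_a = lambda w: w.count("a") if set(w) <= {"a"} else None  # |w| on a*
    one_b   = lambda w: 10 if w == "b" else None                   # maps "b" to 10

    f = split_sum(count_a, one_b)
    print(f("aab"), f("bb"))   # 12 None
    ```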

    An Algebraic Framework for Compositional Program Analysis

    The purpose of a program analysis is to compute an abstract meaning for a program which approximates its dynamic behaviour. A compositional program analysis accomplishes this task with a divide-and-conquer strategy: the meaning of a program is computed by dividing it into sub-programs, computing their meanings, and then combining the results. Compositional program analyses are desirable because they can yield scalable (and easily parallelizable) program analyses. This paper presents an algebraic framework for designing, implementing, and proving the correctness of compositional program analyses. A program analysis in our framework is defined by an algebraic structure equipped with sequencing, choice, and iteration operations. From the analysis design perspective, a particularly interesting consequence of this is that the meaning of a loop is computed by applying the iteration operator to the loop body. This style of compositional loop analysis can yield interesting ways of computing loop invariants that cannot be defined iteratively. We identify a class of algorithms, the so-called path-expression algorithms [Tarjan 1981, Scholz 2007], which can be used to efficiently implement analyses in our framework. Lastly, we develop a theory for proving the correctness of an analysis by establishing an approximation relationship between an algebra defining a concrete semantics and an algebra defining an analysis. Comment: 15 pages.
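
    A minimal sketch of this style of analysis (all names assumed, not the paper's API): the algebra below interprets sequencing as relational composition, choice as join, and iteration as reflexive-transitive closure over a finite set of abstract states, so a loop's meaning is obtained directly from its body rather than by statement-by-statement fixpoint iteration.

    ```python
    def seq(r, s):      # sequencing = relational composition
        return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

    def choice(r, s):   # choice = join
        return r | s

    def iterate(r, states):  # iteration = reflexive-transitive closure
        closure = {(a, a) for a in states}
        while True:
            bigger = closure | seq(closure, r)
            if bigger == closure:
                return closure
            closure = bigger

    states = {0, 1, 2}
    inc = {(0, 1), (1, 2)}           # abstract "increment" step (loop body)
    loop = iterate(inc, states)      # loop meaning computed compositionally
    print(sorted(loop))              # [(0,0), (0,1), (0,2), (1,1), (1,2), (2,2)]
    ```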