
    Query Evaluation in Deductive Databases

    It is desirable to answer queries posed to deductive databases by computing fixpoints, because such computations are directly amenable to set-oriented fact processing. However, the classical fixpoint procedures based on bottom-up processing, the naive and semi-naive methods, are rather primitive and often inefficient. In this article, we rely on bottom-up meta-interpretation to formalize a new fixpoint procedure that performs a different kind of reasoning: we specify a top-down query answering method, which we call the Backward Fixpoint Procedure. We then reconsider query evaluation methods for recursive databases. First, we show that the methods based on rewriting on the one hand, and the methods based on resolution on the other, implement the Backward Fixpoint Procedure. Second, we interpret the rewritings of the Alexander and Magic Set methods as specializations of the Backward Fixpoint Procedure. Finally, we argue that such a rewriting is also needed in a database context to implement the resolution-based methods efficiently. Thus, the methods based on rewriting and the methods based on resolution implement the same top-down evaluation of the original database rules by means of auxiliary rules processed bottom-up.
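    For contrast with the Backward Fixpoint Procedure discussed above, the following is a minimal sketch of the classical semi-naive bottom-up evaluation that the abstract mentions, shown on the textbook transitive-closure program; the program, relation names, and data are illustrative and not taken from the paper.

```python
# Semi-naive bottom-up evaluation of the classic transitive-closure program
#   path(X, Y) :- edge(X, Y).
#   path(X, Z) :- edge(X, Y), path(Y, Z).
# The program and relations are illustrative, not taken from the paper.

def seminaive_path(edges):
    """Compute the least fixpoint of `path` from a set of (x, y) edge facts."""
    path = set(edges)      # path(X, Y) :- edge(X, Y).
    delta = set(edges)     # facts derived in the previous round
    while delta:
        new = set()
        # The recursive rule is joined only against the newly derived facts
        # (delta), which is what distinguishes semi-naive from naive evaluation.
        for (x, y) in edges:
            for (y2, z) in delta:
                if y == y2 and (x, z) not in path:
                    new.add((x, z))
        path |= new
        delta = new
    return path

if __name__ == "__main__":
    print(sorted(seminaive_path({(1, 2), (2, 3), (3, 4)})))
    # [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```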

    Query Evaluation in Recursive Databases


    Preliminary results on Ontology-based Open Data Publishing

    Despite the current interest in Open Data publishing, a formal and comprehensive methodology is still missing that supports an organization in deciding which data to publish and in carrying out precise procedures for publishing high-quality data. In this paper we argue that the Ontology-based Data Management paradigm can provide a formal basis for a principled approach to publishing high-quality, semantically annotated Open Data. We describe two main approaches to using an ontology for this endeavor, and then present some technical results on one of them, called bottom-up, in which the specification of the data to be published is given in terms of the sources, and specific techniques allow suitable annotations to be derived for interpreting the published data in the light of the ontology.

    Constraint Design Rewriting

    We propose an algebraic approach to the design and transformation of constraint networks, inspired by Architectural Design Rewriting (ADR). The approach can be understood as (i) an extension of ADR with constraints, and (ii) an application of ADR to the design of reconfigurable constraint networks. The main idea is to consider classes of constraint networks as algebras whose operators are used to denote constraint networks by terms. Constraint network transformations, such as constraint propagation, are specified with rewrite rules that exploit the structure of the network provided by the terms.
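    As a loose illustration of the idea of denoting constraint networks by terms and specifying propagation as rewriting, here is a toy sketch; the constructors ('and', 'unary') and the single propagation rule are hypothetical and are not ADR's actual operators.

```python
# Toy illustration (not ADR's actual algebra): a constraint network as a term,
# with one rewrite rule that merges two unary constraints on the same variable.

def rewrite(term):
    """Apply the structure-driven propagation step wherever it matches."""
    if isinstance(term, tuple) and term[0] == "and":
        left, right = rewrite(term[1]), rewrite(term[2])
        if (left[0] == "unary" and right[0] == "unary"
                and left[1] == right[1]):
            # (x in D1) and (x in D2)  ~>  x in D1 intersect D2
            return ("unary", left[1], left[2] & right[2])
        return ("and", left, right)
    return term

net = ("and", ("unary", "x", {1, 2, 3}), ("unary", "x", {2, 3, 4}))
print(rewrite(net))   # ('unary', 'x', {2, 3})
```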

    Model Checking Linear Logic Specifications

    The overall goal of this paper is to investigate the theoretical foundations of algorithmic verification techniques for first-order linear logic specifications. The fragment of linear logic we consider is based on the linear logic programming language LO, enriched with universally quantified goal formulas. Although LO was originally introduced as a theoretical foundation for extensions of logic programming languages, it can also be viewed as a very general language for specifying a wide range of infinite-state concurrent systems. Our approach is based on the relation between backward reachability and provability highlighted in our previous work on propositional LO programs. Following this line of research, we define here a general framework for the bottom-up evaluation of first-order linear logic specifications. The evaluation procedure is based on an effective fixpoint operator working on a symbolic representation of infinite collections of first-order linear logic formulas. The theory of well quasi-orderings can be used to provide sufficient conditions for the termination of the evaluation of non-trivial fragments of first-order linear logic.
    Comment: 53 pages, 12 figures. Under consideration for publication in Theory and Practice of Logic Programming.
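    The evaluation scheme described above (an effective fixpoint operator over symbolic representations of infinite sets, with termination via well quasi-orderings) can be illustrated, under strong simplifying assumptions, by symbolic backward reachability over vectors of counters rather than over first-order LO formulas; the sketch below is only an analogy, not the paper's procedure.

```python
# A minimal sketch of symbolic backward reachability over upward-closed sets,
# instantiated on vectors of counters (a Petri-net-style system) rather than on
# the paper's first-order LO formulas; names and the example are illustrative.

def pre_basis(target, transitions):
    """Minimal elements of the states from which the upward closure of `target`
    is reachable; a transition consumes a `take` vector and produces a `put` vector."""
    def leq(a, b):
        return all(x <= y for x, y in zip(a, b))

    def minimize(vs):
        return {v for v in vs if not any(leq(w, v) and w != v for w in vs)}

    basis = minimize(set(target))
    while True:
        new = set()
        for take, put in transitions:
            for m in basis:
                # smallest state that enables the transition and covers m afterwards
                new.add(tuple(t + max(mi - p, 0) for mi, t, p in zip(m, take, put)))
        nxt = minimize(basis | new)
        if nxt == basis:          # fixpoint reached; the well quasi-ordering
            return basis          # (Dickson's lemma) guarantees termination
        basis = nxt

# One transition moving a token from place 0 to place 1: take (1,0), put (0,1).
# From which states can we cover the target (0, 1)?
print(sorted(pre_basis([(0, 1)], [((1, 0), (0, 1))])))   # [(0, 1), (1, 0)]
```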

    Unified Analysis of Collapsible and Ordered Pushdown Automata via Term Rewriting

    We model collapsible and ordered pushdown systems with term rewriting, by encoding higher-order stacks and multiple stacks into trees. We show a uniform inverse preservation of recognizability result for the resulting class of term rewriting systems, obtained by extending the classic saturation-based approach. This result subsumes and unifies similar analyses of collapsible and ordered pushdown systems. Despite the rich literature on inverse preservation of recognizability for term rewrite systems, our result does not seem to follow from any previous study.
    Comment: in Proc. of FRE
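    The classic saturation-based approach that this result extends can be sketched, for an ordinary (single, non-collapsible stack) pushdown system, as the usual pre* construction on an automaton over stack words; the rules, states, and stack alphabet below are illustrative.

```python
# A sketch of the classic saturation-based pre* construction for an ordinary
# pushdown system (the word case that the tree-based result generalizes).
# Stack symbols are single characters; rules and states are illustrative.

def pre_star(rules, trans):
    """Saturate the automaton `trans` (a set of (state, stack_symbol, state) edges)
    with respect to pushdown rules ((p, g), (p2, w)) with len(w) <= 2."""
    trans = set(trans)
    changed = True
    while changed:
        changed = False
        for (p, g), (p2, w) in rules:
            # states reachable from p2 by reading the word w in the current automaton
            targets = {p2}
            for a in w:
                targets = {q2 for q in targets for (q1, b, q2) in trans
                           if q1 == q and b == a}
            for q in targets:
                if (p, g, q) not in trans:   # saturation step: add p --g--> q
                    trans.add((p, g, q))
                    changed = True
    return trans

# Rule (p, a) -> (p, "") pops `a`; start from an automaton accepting the single
# configuration (p, "b"), i.e. the edge p --b--> f.  The saturated automaton
# accepts exactly (p, a^n b) for n >= 0.
rules = [(("p", "a"), ("p", ""))]
print(sorted(pre_star(rules, {("p", "b", "f")})))
# [('p', 'a', 'p'), ('p', 'b', 'f')]
```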

    Linear Compressed Pattern Matching for Polynomial Rewriting (Extended Abstract)

    This paper is an extended abstract of an analysis of term rewriting in which the terms in the rewrite rules, as well as the term to be rewritten, are compressed by a singleton tree grammar (STG). This form of compression is more general than node sharing or representing terms as dags, since partial trees (contexts) can also be shared in the compression. In the first part, efficient but complex algorithms for detecting applicability of a rewrite rule under STG compression are constructed and analyzed. The second part applies these results to term rewriting sequences. The main result for submatching is that finding a redex of a left-linear rule can be performed in polynomial time under STG compression. The main implications for rewriting and (single-position or parallel) rewriting steps are: (i) under STG compression, n rewriting steps can be performed in nondeterministic polynomial time; (ii) under STG compression and for left-linear rewrite rules, a sequence of n rewriting steps can be performed in polynomial time; and (iii) for compressed rewrite rules whose left-hand sides are either dag-compressed or ground and STG-compressed, and an STG-compressed target term, n rewriting steps can be performed in polynomial time.
    Comment: In Proceedings TERMGRAPH 2013, arXiv:1302.599
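    The abstract notes that STG compression generalizes dag sharing; the sketch below shows only that simpler dag case (hash-consed term nodes) together with matching of a linear pattern against a shared node, and does not capture STG context sharing.

```python
# Toy sketch of the simpler special case mentioned in the abstract: terms
# compressed by node sharing (a dag, built by hash-consing) and matching of a
# linear pattern against a dag node.  STG context sharing is not shown.

nodes, table = [], {}

def mk(sym, *kids):
    """Hash-consing: identical subterms share one dag node (returned as an index)."""
    key = (sym,) + kids
    if key not in table:
        table[key] = len(nodes)
        nodes.append(key)
    return table[key]

def match(pattern, node, subst):
    """Match a linear pattern (variables are strings, terms are tuples) against a dag node."""
    if isinstance(pattern, str):       # a variable occurs at most once: just bind it
        subst[pattern] = node
        return True
    sym, kids = nodes[node][0], nodes[node][1:]
    return (pattern[0] == sym and len(pattern) - 1 == len(kids)
            and all(match(p, k, subst) for p, k in zip(pattern[1:], kids)))

# f(g(a), g(a)) -- the two g(a) subterms share a single node in the dag.
t = mk("f", mk("g", mk("a")), mk("g", mk("a")))
s = {}
print(match(("f", "X", "Y"), t, s), s)   # True {'X': 1, 'Y': 1}
```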