
    Upside-down Deduction

    Over recent years, several proposals have been made to enhance database systems with automated reasoning. In this article we analyze two such enhancements based on meta-interpretation. We consider, on the one hand, the theorem prover Satchmo and, on the other hand, the Alexander and Magic Set methods. Although they achieve different goals and are based on distinct reasoning paradigms, Satchmo and the Alexander or Magic Set methods can be similarly described by upside-down meta-interpreters, i.e., meta-interpreters implementing one reasoning principle in terms of the other. Upside-down meta-interpretation gives rise to simple and efficient implementations, but has not been investigated in the past. This article is devoted to studying this technique. We show that it permits one to inherit a search strategy from an inference engine, instead of implementing it, and to combine bottom-up and top-down reasoning. These properties yield an explanation for the efficiency of Satchmo and a justification for the unconventional approach to top-down reasoning of the Alexander and Magic Set methods.
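
    The following is only a small illustrative sketch, not Satchmo or the Alexander/Magic Set rewriting themselves: a bottom-up model generation loop whose rule bodies are checked by a top-down prover, roughly in the spirit of an upside-down meta-interpreter that implements one reasoning principle in terms of the other. The propositional rules and facts are invented for illustration.

        # Hypothetical sketch: forward chaining driven by a backward-chaining body check.
        def prove(atom, facts, rules, seen=frozenset()):
            """Top-down (backward-chaining) check whether `atom` follows from facts and rules."""
            if atom in facts:
                return True
            if atom in seen:                      # avoid looping on recursive rules
                return False
            return any(head == atom and all(prove(b, facts, rules, seen | {atom}) for b in body)
                       for body, head in rules)

        def saturate(facts, rules):
            """Bottom-up loop: whenever a rule body is provable top-down, assert the head."""
            facts = set(facts)
            changed = True
            while changed:
                changed = False
                for body, head in rules:
                    if head not in facts and all(prove(b, facts, rules) for b in body):
                        facts.add(head)           # forward step: add the derived consequence
                        changed = True
            return facts

        rules = [(("rain",), "wet"), (("wet", "cold"), "ice")]
        print(saturate({"rain", "cold"}, rules))  # {'rain', 'cold', 'wet', 'ice'}

    In the setting the abstract describes, the top-down side is not programmed at all but inherited from the underlying inference engine, which is the point made about inheriting a search strategy instead of implementing it.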

    Query Evaluation in Deductive Databases

    It is desirable to answer queries posed to deductive databases by computing fixpoints because such computations are directly amenable to set-oriented fact processing. However, the classical fixpoint procedures based on bottom-up processing — the naive and semi-naive methods — are rather primitive and often inefficient. In this article, we rely on bottom-up meta-interpretation for formalizing a new fixpoint procedure that performs a different kind of reasoning: We specify a top-down query answering method, which we call the Backward Fixpoint Procedure. Then, we reconsider query evaluation methods for recursive databases. First, we show that the methods based on rewriting on the one hand, and the methods based on resolution on the other hand, implement the Backward Fixpoint Procedure. Second, we interpret the rewritings of the Alexander and Magic Set methods as specializations of the Backward Fixpoint Procedure. Finally, we argue that such a rewriting is also needed in a database context for efficiently implementing the resolution-based methods. Thus, the methods based on rewriting and the methods based on resolution implement the same top-down evaluation of the original database rules by means of auxiliary rules processed bottom-up.
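
    As a point of reference for the fixpoint computations discussed above, here is a minimal sketch of the classical naive bottom-up method for Datalog-style rules, with invented predicates (parent/ancestor). It is not the Backward Fixpoint Procedure or the Alexander/Magic Set rewriting, only the baseline those methods refine.

        # Naive bottom-up fixpoint evaluation for Datalog-style rules (illustrative only).
        # Atoms are tuples like ("ancestor", "X", "Y"); uppercase strings are variables.

        def is_var(t):
            return isinstance(t, str) and t[:1].isupper()

        def match(atom, fact, subst):
            """Try to extend `subst` so that `atom` instantiates to `fact`."""
            if atom[0] != fact[0] or len(atom) != len(fact):
                return None
            s = dict(subst)
            for a, f in zip(atom[1:], fact[1:]):
                if is_var(a):
                    if s.get(a, f) != f:
                        return None
                    s[a] = f
                elif a != f:
                    return None
            return s

        def body_substs(body, facts, subst=None):
            """All substitutions satisfying every body atom against `facts` (a nested join)."""
            subst = subst or {}
            if not body:
                yield subst
                return
            for fact in facts:
                s = match(body[0], fact, subst)
                if s is not None:
                    yield from body_substs(body[1:], facts, s)

        def fixpoint(facts, rules):
            """Apply every rule to all facts until no new fact is derived."""
            facts = set(facts)
            while True:
                new = set()
                for head, body in rules:
                    for s in body_substs(body, facts):
                        derived = (head[0],) + tuple(s.get(t, t) for t in head[1:])
                        if derived not in facts:
                            new.add(derived)
                if not new:
                    return facts
                facts |= new

        # ancestor(X,Y) <- parent(X,Y);  ancestor(X,Z) <- parent(X,Y), ancestor(Y,Z)
        rules = [
            (("ancestor", "X", "Y"), [("parent", "X", "Y")]),
            (("ancestor", "X", "Z"), [("parent", "X", "Y"), ("ancestor", "Y", "Z")]),
        ]
        facts = {("parent", "ann", "bob"), ("parent", "bob", "cal")}
        print(sorted(f for f in fixpoint(facts, rules) if f[0] == "ancestor"))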

    Query Evaluation in Recursive Databases

    Towards Intelligent Databases

    This article is a presentation of the objectives and techniques of deductive databases. The deductive approach to databases aims at extending other database paradigms, which describe applications extensionally, with intensional definitions. We first show how constructive specifications can be expressed with deduction rules, and how normative conditions can be defined using integrity constraints. We outline the principles of bottom-up and top-down query answering procedures and present the techniques used for integrity checking. We then argue that it is often desirable to manage with a database system not only database applications, but also specifications of system components. We present such meta-level specifications and discuss their advantages over conventional approaches.
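
    As a toy illustration of the two notions contrasted above, the sketch below (with invented relations) places a deduction rule that defines a derived relation intensionally next to an integrity constraint treated as a condition that must have no satisfying facts. It is only meant to make the distinction concrete, not to reflect any particular system.

        # Extensional facts, one deduction rule, one integrity constraint (illustrative names).
        parent = {("ann", "bob"), ("bob", "cal")}

        def grandparent():
            """Deduction rule: grandparent(X,Z) <- parent(X,Y), parent(Y,Z)."""
            return {(x, z) for (x, y1) in parent for (y2, z) in parent if y1 == y2}

        def no_self_parent_violations():
            """Integrity constraint as a denial: no X may be its own parent."""
            return [(x, y) for (x, y) in parent if x == y]

        print(grandparent())                 # {('ann', 'cal')}
        print(no_self_parent_violations())   # [] means the constraint holds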

    A Universal Machine for Biform Theory Graphs

    Broadly speaking, there are two kinds of semantics-aware assistant systems for mathematics: proof assistants express the semantics in logic and emphasize deduction, while computer algebra systems express the semantics in programming languages and emphasize computation. Combining the complementary strengths of both approaches while mending their complementary weaknesses has been an important goal of the mechanized mathematics community for some time. We pick up on the idea of biform theories and interpret it in the MMT/OMDoc framework, which introduced the foundations-as-theories approach and can thus represent both logics and programming languages as theories. This yields a formal, modular framework of biform theory graphs which mixes specifications and implementations sharing the module system and typing information. We present automated knowledge management workflows that interface to existing specification/programming tools and enable an OpenMath Machine that operationalizes biform theories, evaluating expressions by exhaustively applying the implementations of the respective operators. We evaluate the new biform framework by adding implementations to the OpenMath standard content dictionaries. (Comment: Conferences on Intelligent Computer Mathematics, CICM 2013. The final publication is available at http://link.springer.com.)
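
    The following toy evaluator (plain Python, not MMT/OMDoc or the actual OpenMath Machine; the operator names and the registry are invented) is meant only to illustrate the evaluation idea described above: expressions are operator trees, and registered implementations are applied bottom-up wherever all arguments reduce to values, while unimplemented operators are left symbolic.

        import math

        # Registry mapping operator names to executable implementations (invented).
        impls = {
            "plus":  lambda *args: sum(args),
            "times": lambda *args: math.prod(args),
        }

        def evaluate(expr):
            """Apply operator implementations bottom-up over an expression tree.

            An expression is either a number (a value) or a tuple
            (operator_name, arg1, arg2, ...). Arguments are evaluated first; if all
            of them reduce to values and an implementation is registered, it is
            applied, otherwise the partially simplified term is kept symbolic.
            """
            if not isinstance(expr, tuple):
                return expr                                  # already a value
            op, *args = expr
            args = [evaluate(a) for a in args]               # evaluate subterms first
            if op in impls and all(not isinstance(a, tuple) for a in args):
                return impls[op](*args)                      # apply the implementation
            return (op, *args)                               # no implementation: keep symbolic

        print(evaluate(("plus", 1, ("times", 2, 3))))        # 7
        print(evaluate(("plus", 1, ("unknown_op", 2))))      # stays symbolic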

    A Framework for Program Development Based on Schematic Proof

    Often, calculi for manipulating and reasoning about programs can be recast as calculi for synthesizing programs. The difference often involves only a slight shift of perspective: admitting metavariables into proofs. We propose that such calculi should be implemented in logical frameworks that support this kind of proof construction and that such an implementation can unify program verification and synthesis. Our proposal is illustrated with a worked example developed in Paulson's Isabelle system. We also give examples of existing calculi that are closely related to the methodology we are proposing and others that can be profitably recast using our approach.

    Explicit Substitutions for Contextual Type Theory

    In this paper, we present an explicit substitution calculus which distinguishes between ordinary bound variables and meta-variables. Its typing discipline is derived from contextual modal type theory. We first present a dependently typed lambda calculus with explicit substitutions for ordinary variables and explicit meta-substitutions for meta-variables. We then present a weak head normalization procedure which performs both substitutions lazily and in a single pass, thereby combining substitution walks for the two different classes of variables. Finally, we describe a bidirectional type checking algorithm which uses weak head normalization and prove soundness. (Comment: In Proceedings LFMTP 2010, arXiv:1009.218.)
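
    As a rough, untyped illustration of performing substitutions lazily (it is not the contextual calculus of the paper and has no meta-variables), the sketch below computes weak head normal forms using de Bruijn indices and environments of closures, delaying each substitution until a variable is actually looked up.

        def whnf(term, env=()):
            """Reduce `term` to weak head normal form under environment `env`.

            Terms are ("var", i) with de Bruijn index i, ("lam", body), or ("app", f, a).
            Results are a closure ("clo", t, e) whose term t is a lambda or an unbound
            variable, or ("stuck", head, arg_closure) for a neutral application.
            """
            kind = term[0]
            if kind == "var":
                i = term[1]
                if i < len(env):
                    _, t, e = env[i]             # delayed substitution: unfold the closure
                    return whnf(t, e)
                return ("clo", term, env)        # variable not bound here: stuck
            if kind == "lam":
                return ("clo", term, env)        # weak reduction: never go under the binder
            head = whnf(term[1], env)            # reduce the function position only
            if head[0] == "clo" and head[1][0] == "lam":
                arg = ("clo", term[2], env)      # the argument is kept unevaluated (lazy)
                return whnf(head[1][1], (arg,) + head[2])
            return ("stuck", head, ("clo", term[2], env))

        # (\x. x) ((\y. y) v5), with v5 a free variable: both redexes are unfolded
        # and the result is the closure for the free variable.
        identity = ("lam", ("var", 0))
        print(whnf(("app", identity, ("app", identity, ("var", 5)))))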