332 research outputs found

    Constraint Handling Rules with Binders, Patterns and Generic Quantification

    Full text link
    Constraint Handling Rules provide descriptions for constraint solvers. However, they fall short when those constraints specify some binding structure, like higher-rank types in a constraint-based type inference algorithm. In this paper, the term syntax of constraints is replaced by λ-tree syntax, in which binding is explicit, and a new ∇ generic quantifier is introduced, which is used to create new fresh constants. Comment: Paper presented at the 33rd International Conference on Logic Programming (ICLP 2017), Melbourne, Australia, August 28 to September 1, 2017. 16 pages, LaTeX, no PDF figures
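
    The following Haskell fragment is only a hedged sketch of the λ-tree idea, not the paper's CHR system: object-level binders are represented by meta-level functions, and a counter-based name supply stands in for the fresh constants created by the ∇ quantifier.

        -- A minimal sketch (not the paper's system): constraint terms in
        -- lambda-tree syntax, where object-level binding is a meta-level
        -- function, plus a counter-based supply of fresh constants standing
        -- in for the generic (nabla) quantifier.
        data Tm
          = Cst String        -- constant, including freshly generated ones
          | App Tm Tm         -- application
          | Lam (Tm -> Tm)    -- binder made explicit as a Haskell abstraction

        -- Open a binder by instantiating it with a fresh constant; the Int
        -- argument is a hypothetical name supply, not part of the paper.
        openFresh :: Int -> (Tm -> Tm) -> (Tm, Int)
        openFresh n body = (body (Cst ("c" ++ show n)), n + 1)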

    Towards an Efficient Evaluation of General Queries

    Get PDF
    Database applications often need to evaluate queries containing quantifiers or disjunctions, e.g., for handling general integrity constraints. Existing efficient methods for processing quantifiers depart from the relational model as they rely on non-algebraic procedures. Looking at quantified query evaluation from a new angle, we propose an approach to processing quantifiers that makes use of relational algebra operators only. Our approach proceeds in two phases. The first phase normalizes the queries, producing a canonical form. This form makes it possible to improve the translation into relational algebra performed during the second phase. The improved translation relies on a new operator - the complement-join - that generalizes the set difference, on algebraic expressions of universal quantifiers that avoid the expensive division operator in many cases, and on a special processing of disjunctions by means of constrained outer-joins. Our method achieves efficiency at least comparable with that of previous proposals, and better in most cases. Furthermore, it is considerably simpler to implement as it relies entirely on relational data structures and operators
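
    As a hedged illustration of why universal quantification need not require the division operator (this is the general rewrite idea, not the paper's complement-join), consider a list-based Haskell sketch with made-up relation names:

        import Data.List (nub)

        -- "Suppliers that supply every part", evaluated without relational
        -- division: keep a supplier only if no required part is missing.
        type Supplies = [(String, String)]   -- (supplier, part) pairs

        suppliesAllParts :: Supplies -> [String] -> [String]
        suppliesAllParts sp parts =
          [ s | s <- nub (map fst sp)
              , null [ p | p <- parts, (s, p) `notElem` sp ] ]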

    The dagger lambda calculus

    Full text link
    We present a novel lambda calculus that casts the categorical approach to the study of quantum protocols into the rich and well established tradition of type theory. Our construction extends the linear typed lambda calculus with a linear negation of "trivialised" De Morgan duality. Reduction is realised through explicit substitution, based on a symmetric notion of binding of global scope, with rules acting on the entire typing judgement instead of on a specific subterm. Proofs of subject reduction, confluence, strong normalisation and consistency are provided, and the language is shown to be an internal language for dagger compact categories. Comment: In Proceedings QPL 2014, arXiv:1412.810
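
    The calculus itself is beyond a short snippet, but the explicit-substitution mechanism it relies on can be sketched, purely as an illustration with invented constructors, in a few lines of Haskell:

        -- Illustrative only, not the dagger lambda calculus: a beta step that
        -- records a pending substitution instead of performing it eagerly.
        data Term
          = V Int                -- de Bruijn index
          | Lam Term             -- abstraction
          | App Term Term        -- application
          | Sub Term Term        -- explicit substitution: t[0 := u]

        -- (\x. t) u  ~>  t[0 := u]
        betaStep :: Term -> Maybe Term
        betaStep (App (Lam t) u) = Just (Sub t u)
        betaStep _               = Nothing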

    Strategic term rewriting and its application to a VDM-SL to SQL conversion

    Get PDF
    We constructed a tool, called VooDooM, which converts datatypes in VDM-SL into SQL relational data models. The conversion involves transformation of algebraic types to maps and products, and pointer introduction. The conversion is specified as a theory of refinement by calculation. The implementation technology is strategic term rewriting in Haskell, as supported by the Strafunski bundle. Due to these choices of theory and technology, the road from theory to practice is straightforward. Fundação para a Ciência e a Tecnologia (FCT) - POSI/ICHS/44304/2002; Agência de Inovação (ADI) - ∑!223
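
    The sketch below illustrates the strategic-rewriting style in Haskell using the SYB library (Data.Generics) rather than Strafunski; the type fragment and the single refinement rule are invented for illustration and are not VooDooM's actual rules:

        {-# LANGUAGE DeriveDataTypeable #-}
        import Data.Generics (Data, Typeable, everywhere, mkT)

        data Ty                      -- toy fragment of VDM-SL-like types
          = TBasic String
          | TSet Ty                  -- set of T
          | TMap Ty Ty               -- map T to U
          | TProduct [Ty]            -- T1 * T2 * ...
          deriving (Show, Data, Typeable)

        -- One refinement rule: represent a set as a map into a unit-like
        -- type, which fits a relational (key/value) encoding.
        setToMap :: Ty -> Ty
        setToMap (TSet t) = TMap t (TBasic "unit")
        setToMap t        = t

        -- The generic bottom-up traversal is the "strategy": it applies the
        -- rule everywhere, independently of the datatype's shape.
        refine :: Ty -> Ty
        refine = everywhere (mkT setToMap)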

    The design and implementation of a relational programming system.

    Get PDF
    The declarative class of computer languages consists mainly of two paradigms - the logic and the functional. Much research has been devoted in recent years to the integration of the two with the aim of securing the advantages of both without retaining their disadvantages. To date this research has, arguably, been less fruitful than initially hoped. A large number of composite functional/logical languages have been proposed but have generally been marred by the lack of a firm, cohesive, mathematical basis. More recently new declarative paradigms, equational and constraint languages, have been advocated. These, however, do not fully encompass those features we perceive as being central to functional and logic languages. The crucial functional features are higher-order definitions, static polymorphic typing, applicative expressions and laziness. The crucial logic features are the ability to reason about both functional and non-functional relationships and to handle computations involving search. This thesis advocates a new declarative paradigm which lies midway between functional and logic languages - the so-called relational paradigm. In a relational language, program and data alike are denoted by relations. All expressions are relations constructed from simpler expressions using operators which form a relational algebra. The impetus for use of relations in a declarative language comes from observations concerning their connection to functional and logic programming. Relations are mathematically more general than functions, modelling non-functional as well as functional relationships. They also form the basis of many logic languages, for example, Prolog. This thesis proposes a new relational language based entirely on binary relations, named Drusilla. We demonstrate the functional and logic aspects of Drusilla. It retains the higher-order objects and polymorphism found in modern functional languages but handles non-determinism and models relationships between objects in the manner of a logic language, with the notion of an algorithm being composed of logic and control elements. Different programming styles - functional, logic and relational - are illustrated. However, such expressive power does not come for free; it has associated with it a high cost of implementation. Two main techniques are used in the necessarily complex language interpreter. A type inference system checks programs to ensure they are meaningful and simultaneously performs automatic representation selection for relations. A symbolic manipulation system transforms programs to improve the efficiency of expressions and to increase the number of possible representations for relations while preserving program meaning
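
    As a rough illustration of the relational style described above (not Drusilla's syntax or semantics), binary relations and two typical combinators can be written directly in Haskell:

        import qualified Data.Set as Set

        -- Data and programs as binary relations, with composition and
        -- converse as core relational-algebra combinators.
        type Rel a b = Set.Set (a, b)

        compose :: (Ord a, Eq b, Ord c) => Rel a b -> Rel b c -> Rel a c
        compose r s = Set.fromList
          [ (a, c) | (a, b)  <- Set.toList r
                   , (b', c) <- Set.toList s
                   , b == b' ]

        converse :: (Ord a, Ord b) => Rel a b -> Rel b a
        converse = Set.map (\(a, b) -> (b, a))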

    Translating Alloy to SMT-LIB

    Get PDF
    Alloy is a tool for writing specifications and constructing instances of these specifications, based on relational logic. Satisfiability Modulo Theories (SMT) solvers embody another popular approach to specification and instance-generation; most solvers implement a language based on the SMT-LIB standard. The Alloy language and the core of SMT-LIB are each formally equivalent, over finite structures, to first-order mathematical logic; however, they support quite different modeling idioms. To help bridge this gap we have initiated a project to construct a translation from Alloy to SMT-LIB
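
    To make the idiom gap concrete, here is a hedged toy example (not the project's translator): a Haskell value standing for a simple first-order constraint, rendered as SMT-LIB text; the sort and relation names are invented.

        -- A tiny first-order formula over a binary relation, printed in
        -- SMT-LIB syntax; roughly what Alloy's "all x | some y | x->y in r"
        -- amounts to over a finite universe of atoms.
        data Formula
          = ForAll String Formula
          | Exists String Formula
          | Holds String String String       -- r(x, y)

        toSmtLib :: Formula -> String
        toSmtLib (ForAll x f)  = "(forall ((" ++ x ++ " Atom)) " ++ toSmtLib f ++ ")"
        toSmtLib (Exists x f)  = "(exists ((" ++ x ++ " Atom)) " ++ toSmtLib f ++ ")"
        toSmtLib (Holds r x y) = "(" ++ r ++ " " ++ x ++ " " ++ y ++ ")"

        example :: String
        example = "(assert " ++ toSmtLib (ForAll "x" (Exists "y" (Holds "r" "x" "y"))) ++ ")"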

    Infinitary λ-Calculi from a Linear Perspective (Long Version)

    Get PDF
    We introduce a linear infinitary λ-calculus, called ℓΛ∞, in which two exponential modalities are available, the first one being the usual, finitary one, the other being the only construct interpreted coinductively. The obtained calculus embeds the infinitary applicative λ-calculus and is universal for computations over infinite strings. What is particularly interesting about ℓΛ∞ is that the refinement induced by linear logic makes it possible to restrict both modalities so as to get calculi which are terminating inductively and productive coinductively. We exemplify this idea by analysing a fragment of ℓΛ built around the principles of SLL and 4LL. Interestingly, it enjoys confluence, contrary to what happens in ordinary infinitary λ-calculi
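
    Purely as a hedged illustration of the objects involved (not the calculus ℓΛ∞ itself), Haskell's laziness can already represent infinite λ-terms of which any finite prefix is observable productively:

        -- An infinite spine of applications f (f (f ...)), plus a function
        -- that observes a finite prefix of any (possibly infinite) term.
        data T = Var String | Lam String T | App T T

        infiniteSpine :: T -> T
        infiniteSpine f = App f (infiniteSpine f)    -- never finite, but productive

        prefix :: Int -> T -> T
        prefix 0 _         = Var "..."               -- cut the term off here
        prefix n (App t u) = App (prefix (n - 1) t) (prefix (n - 1) u)
        prefix n (Lam x t) = Lam x (prefix (n - 1) t)
        prefix _ v         = v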