    Abstract Canonical Inference

    An abstract framework of canonical inference is used to explore how different proof orderings induce different variants of saturation and completeness. Notions such as completion, paramodulation, saturation, redundancy elimination, and rewrite-system reduction are connected to proof orderings. Fairness of deductive mechanisms is defined in terms of proof orderings, distinguishing between (ordinary) "fairness," which yields completeness, and "uniform fairness," which yields saturation. (28 pages, no figures; to appear in ACM Transactions on Computational Logic.)
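    The redundancy-elimination idea mentioned above has a simple concrete reading: once equations are oriented into rewrite rules, an equation whose two sides reduce to the same normal form proves nothing new. The Python sketch below illustrates this with string rewriting standing in for term rewriting; the rules and names are illustrative, not taken from the paper.

    # Toy illustration: an equation s = t is redundant w.r.t. a set of
    # oriented rewrite rules if both sides reduce to the same normal form.
    # String rewriting stands in for term rewriting; names are illustrative.

    def normalize(s, rules, limit=1000):
        """Apply rules (lhs -> rhs pairs) leftmost-first until none applies."""
        for _ in range(limit):
            for lhs, rhs in rules:
                if lhs in s:
                    s = s.replace(lhs, rhs, 1)
                    break
            else:
                return s  # no rule applied: s is in normal form
        raise RuntimeError("no normal form within step limit")

    def is_redundant(eq, rules):
        """An equation is subsumed when both sides share a normal form."""
        left, right = eq
        return normalize(left, rules) == normalize(right, rules)

    rules = [("aa", "a"), ("ba", "ab")]        # oriented by length, then lex
    print(is_redundant(("baa", "ab"), rules))  # True: both normalize to "ab"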

    Modal Kleene algebra and applications - a survey

    Modal Kleene algebras are Kleene algebras with forward and backward modal operators defined via domain and codomain operations. They provide a concise and convenient algebraic framework that subsumes various other calculi and supports reasoning in a variety of areas. We survey the basic theory and some prominent applications. On the system-semantics side, these include Hoare logic and PDL (Propositional Dynamic Logic), the wp calculus and predicate-transformer semantics, temporal logics, and termination analysis of rewrite and state transition systems. On the derivation side, we apply the framework to game analysis and greedy-like algorithms.
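    A standard concrete model makes these modal operators tangible: programs as binary relations on a finite state space, tests as subsets of states, and the forward diamond defined through the domain operation, as the survey describes. The Python sketch below implements this relational model; the state space and sample relation are illustrative.

    # Concrete model of the forward modal operators: programs as binary
    # relations on a finite state space, tests as subsets of states.
    # diamond(a, p) = dom(a ; p): states from which some a-step lands in p.
    # box(a, p) is its De Morgan dual.

    STATES = {0, 1, 2, 3}

    def compose(a, b):
        """Relational composition a ; b."""
        return {(x, z) for (x, y1) in a for (y2, z) in b if y1 == y2}

    def dom(a):
        """Domain of a relation: states with at least one outgoing step."""
        return {x for (x, _) in a}

    def diamond(a, p):
        """Forward diamond <a>p via the domain of a restricted to p."""
        restrict = {(y, y) for y in p}        # the test p as a sub-identity
        return dom(compose(a, restrict))

    def box(a, p):
        """Forward box [a]p = complement of <a>(complement of p)."""
        return STATES - diamond(a, STATES - p)

    step = {(0, 1), (1, 2), (2, 2), (3, 0)}   # a sample transition relation
    print(diamond(step, {2}))  # {1, 2}: states with some step into {2}
    print(box(step, {2}))      # {1, 2}: states whose every step stays in {2}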

    Bibliography


    13th International Workshop on Expressiveness in Concurrency


    Lambda-calculus and formal language theory

    Formal and symbolic approaches have opened many application fields for computer science. The rich and fruitful connection between logic, automata, and algebra is one such approach. It has been used to model natural languages as well as in program verification. In the mathematics of language it can model phenomena ranging from syntax to phonology, while in verification it yields model-checking algorithms for a wide family of programs. This thesis extends this approach to the simply typed lambda-calculus by providing a natural extension of recognizability to programs representable by simply typed terms. This notion is then applied to both the mathematics of language and program verification. In the mathematics of language, it is used to generalize parsing algorithms and to propose high-level methods for describing languages. In program verification, it is used to describe methods for verifying behavioral properties of higher-order programs. In both cases, the link drawn between finite-state methods and denotational semantics provides the means to combine powerful tools from the two worlds.
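    The classical special case of this recognizability notion is easy to make concrete: a string Church-encoded as a simply typed term is recognized by evaluating it in a finite model, interpreting each letter as a transition function of a finite automaton. The Python sketch below does this for an illustrative automaton accepting words with an even number of b's; it is a reading of the classical case, not the thesis's general construction.

    # A word over {a, b} Church-encoded as a term awaiting interpretations
    # of its letters is "run" by evaluating it in a finite model: each
    # letter becomes a transition function on automaton states.

    def church(word):
        """Church-encode a word: c1 c2 ... cn becomes f_c1(f_c2(... x))."""
        def term(fa, fb, x):
            for c in reversed(word):          # innermost application first
                x = fa(x) if c == "a" else fb(x)
            return x
        return term

    # Finite model: states {0, 1}; letter 'b' flips the state, 'a' keeps it,
    # so state 0 is reached exactly by words with an even number of b's.
    delta_a = lambda q: q
    delta_b = lambda q: 1 - q

    def recognized(term, initial=0, accepting=frozenset({0})):
        """Evaluate the term in the finite model and test acceptance."""
        return term(delta_a, delta_b, initial) in accepting

    print(recognized(church("abba")))   # True: two b's
    print(recognized(church("ab")))     # False: one b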

    Problem Theory

    The Turing machine, as it was presented by Turing himself, models the calculations done by a person. This means that we can compute whatever any Turing machine can compute, and therefore we are Turing complete. The question addressed here is: why are we Turing complete? Being Turing complete also means that somehow our brain implements the function that a universal Turing machine implements. Since evolution achieved Turing completeness, the explanation should be evolutionary, but our explanation is mathematical. The trick is to introduce a mathematical theory of problems, under the basic assumption that solving more problems provides more survival opportunities. So we build a problem theory by fusing set theory and computing theory. Then we construct a series of resolvers, each defined by its computing capacity, with the following property: all problems solved by a resolver are also solved by the next resolver in the series, provided a certain condition is satisfied. The last of these conditions is Turing completeness. The series defines a hierarchy of resolvers that can be seen as a framework for the evolution of cognition. The answer to our question would then be: to solve most problems. Along the way, the problem theory defines adaptation, perception, and learning, and it shows that there are just three ways to resolve any problem: routine, trial, and analogy. Most importantly, the theory demonstrates how problems can be used to found mathematics and computing on biology. (43 pages.)
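    The three resolution modes named in the abstract invite a small illustration. The Python sketch below is a hypothetical resolver, not the paper's formal construction: it tries a remembered solution first (routine), then a search over candidates (trial), then adaptation of a similar solved problem (analogy), remembering what works (learning).

    # Hypothetical resolver illustrating routine, trial, and analogy.
    # A problem is modelled as a predicate on candidate solutions;
    # all names and data structures here are illustrative.

    def resolve(problem, memory, candidates, adapt):
        # Routine: the problem was solved before; replay the stored solution.
        if problem in memory:
            return ("routine", memory[problem])
        # Trial: enumerate candidates and test each against the problem.
        for cand in candidates:
            if problem(cand):
                memory[problem] = cand        # learning: remember what worked
                return ("trial", cand)
        # Analogy: adapt the solution of a similar, already solved problem.
        for other, solution in memory.items():
            adapted = adapt(problem, other, solution)
            if adapted is not None and problem(adapted):
                memory[problem] = adapted
                return ("analogy", adapted)
        return ("unsolved", None)

    # Example problem: find a proper divisor of 91.
    is_divisor = lambda n: 1 < n < 91 and 91 % n == 0
    print(resolve(is_divisor, {}, range(2, 91), lambda p, o, s: None))
    # -> ('trial', 7)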

    Confluence Analysis for a Graph Programming Language

    GP 2 is a high-level domain-specific language for programming with graphs. Users write a set of graph transformation rules and organise them with imperative-style control constructs to perform a desired computation on an input graph. As rule selection and matching are non-deterministic, program execution may produce different result graphs. Confluence is a property that establishes the global determinism of a computation despite possible local non-determinism. Conventional confluence analysis is done via so-called critical pairs, which are conflicts in minimal context; a key challenge is extending critical pairs to the setting of GP 2. This thesis develops confluence analysis for GP 2. First, we extend the notion of conflict to GP 2 rules and prove that non-conflicting rule applications commute. Second, we define symbolic critical pairs, establish that there are only finitely many of them and that they represent all possible conflicts, and give an effective procedure for their construction. Third, we solve the problem of unifying GP 2 list expressions, which arises during the construction of critical pairs, by giving a unification procedure that terminates with a finite and complete set of unifiers (under certain restrictions). Last but not least, we specify a confluence analysis algorithm based on symbolic critical pairs and show its soundness by proving the Local Confluence Theorem. Several existing programs are analysed for confluence to demonstrate how the analysis handles several GP 2 features at once and to show the merit of the techniques used.
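    The critical-pair idea behind this analysis is easiest to see in its classical string-rewriting form, which GP 2's symbolic critical pairs generalise to graphs and list expressions. In the Python sketch below (illustrative rules, not GP 2), an overlap of two left-hand sides yields two diverging rewrites, and local confluence asks that every such pair be joinable; with terminating rules, Newman's lemma then gives confluence.

    # Critical pairs for string rewriting: an overlap of two left-hand
    # sides gives two diverging rewrites of the same minimal word.

    def normalize(s, rules, limit=1000):
        for _ in range(limit):
            for lhs, rhs in rules:
                if lhs in s:
                    s = s.replace(lhs, rhs, 1)
                    break
            else:
                return s
        raise RuntimeError("step limit exceeded")

    def critical_pairs(rules):
        """Overlaps where a proper suffix of one lhs is a prefix of another."""
        pairs = []
        for l1, r1 in rules:
            for l2, r2 in rules:
                for k in range(1, min(len(l1), len(l2))):
                    if l1[-k:] == l2[:k]:     # overlap of length k
                        # rewriting the overlap word l1 + l2[k:] both ways:
                        pairs.append((r1 + l2[k:], l1[:-k] + r2))
        return pairs

    def locally_confluent(rules):
        """Joinability of all critical pairs (with terminating rules,
        Newman's lemma then yields confluence)."""
        return all(normalize(u, rules) == normalize(v, rules)
                   for u, v in critical_pairs(rules))

    rules = [("aa", "a")]
    print(critical_pairs(rules))      # [('aa', 'aa')] from the self-overlap
    print(locally_confluent(rules))   # True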