
    A Combination Framework for Complexity

    In this paper we present a combination framework for the polynomial complexity analysis of term rewrite systems. The framework covers both derivational and runtime complexity analysis. We present generalisations of powerful complexity techniques, notably a generalisation of complexity pairs and (weak) dependency pairs. Finally, we also present a novel technique, called dependency graph decomposition, that in the dependency pair setting greatly increases modularity. We employ the framework in the automated complexity tool TCT. TCT implements a majority of the techniques found in the literature, witnessing that our framework is general enough to capture a very broad setting.
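
    To make the dependency pair machinery concrete, the following minimal Python sketch (our own illustration with an ad-hoc tuple encoding of terms, not TCT's implementation) shows the classic dependency pair construction that complexity pairs and weak dependency pairs refine:

        # Terms are nested tuples ('f', t1, ..., tn); variables are plain strings.
        def subterms(t):
            """Yield t together with all of its subterms."""
            yield t
            if isinstance(t, tuple):
                for arg in t[1:]:
                    yield from subterms(arg)

        def mark(t):
            """Mark the root symbol of a term: f(...) becomes f#(...)."""
            return (t[0] + '#',) + t[1:]

        def dependency_pairs(rules):
            """For each rule l -> r and each subterm of r rooted in a defined
            symbol, emit the dependency pair l# -> t#."""
            defined = {l[0] for l, _ in rules}
            return [(mark(l), mark(t))
                    for l, r in rules
                    for t in subterms(r)
                    if isinstance(t, tuple) and t[0] in defined]

        # Addition on Peano numerals: the one recursive call yields one pair.
        rules = [(('add', ('0',), 'y'), 'y'),
                 (('add', ('s', 'x'), 'y'), ('s', ('add', 'x', 'y')))]
        print(dependency_pairs(rules))
        # [(('add#', ('s', 'x'), 'y'), ('add#', 'x', 'y'))]

    Complexity analysis then bounds the length of chains built from such pairs; dependency graph decomposition splits the graph over these pairs so that the parts can be analysed separately.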

    Termination of Rewriting with and Automated Synthesis of Forbidden Patterns

    We introduce a modified version of the well-known dependency pair framework that is suitable for the termination analysis of rewriting under forbidden pattern restrictions. By attaching contexts to dependency pairs that represent the calling contexts of the corresponding recursive function calls, it is possible to incorporate the forbidden pattern restrictions in the (adapted) notion of dependency pair chains, thus yielding a sound and complete approach to termination analysis. Building upon this contextual dependency pair framework, we introduce a dependency pair processor that simplifies problems by analyzing the contextual information of the dependency pairs. Moreover, we show how this processor can be used to synthesize forbidden patterns suitable for a given term rewriting system on-the-fly during the termination analysis. Comment: In Proceedings IWS 2010, arXiv:1012.533

    Link-time smart card code hardening

    This paper presents a feasibility study to protect smart card software against fault-injection attacks by means of link-time code rewriting. This approach avoids the drawbacks of source code hardening, avoids the need for manual assembly writing, and is applicable in conjunction with closed third-party compilers. We implemented a range of cookbook code hardening recipes in a prototype link-time rewriter and evaluated their coverage and associated overhead, concluding that this approach is promising. We demonstrate that the overhead of using an automated link-time approach is not significantly higher than what can be obtained with compile-time hardening or with manual hardening of compiler-generated assembly code.

    SensorCloud: Towards the Interdisciplinary Development of a Trustworthy Platform for Globally Interconnected Sensors and Actuators

    Although Cloud Computing promises to lower IT costs and increase users' productivity in everyday life, the unattractive aspect of this new technology is that the user no longer owns all the devices which process personal data. To lower scepticism, the project SensorCloud investigates techniques to understand and compensate for these adoption barriers in a scenario consisting of cloud applications that utilize sensors and actuators placed in private places. This work provides an interdisciplinary overview of the social and technical core research challenges for the trustworthy integration of sensor and actuator devices with the Cloud Computing paradigm. Most importantly, these challenges include i) ease of development, ii) security and privacy, and iii) social dimensions of a cloud-based system which integrates into private life. When these challenges are tackled in the development of future cloud systems, the attractiveness of new use cases in a sensor-enabled world will be considerably increased for users who currently do not trust the Cloud. Comment: 14 pages, 3 figures, published as a technical report of the Department of Computer Science of RWTH Aachen University

    Applications and extensions of context-sensitive rewriting

    Context-sensitive rewriting is a restriction of term rewriting which is obtained by imposing replacement restrictions on the arguments of function symbols. It has proven useful to analyze computational properties of programs written in sophisticated rewriting-based programming languages such as CafeOBJ, Haskell, Maude, OBJ*, etc. Also, a number of extensions (e.g., to conditional rewriting or constrained equational systems) and generalizations (e.g., controlled rewriting or forbidden patterns) of context-sensitive rewriting have been proposed. In this paper, we provide an overview of these applications and related issues. (C) 2021 Elsevier Inc. All rights reserved. Partially supported by the EU (FEDER), and projects RTI2018-094403-B-C32 and PROMETEO/2019/098. Lucas Alba, S. (2021). Applications and extensions of context-sensitive rewriting. Journal of Logical and Algebraic Methods in Programming. 121:1-33. https://doi.org/10.1016/j.jlamp.2021.100680
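
    As a concrete illustration of the replacement restriction (a hedged sketch of plain context-sensitive rewriting, not of any system from the paper; the term encoding and the if-then-else example are our own), the following Python code performs one rewrite step only at the root or below argument positions that the replacement map mu allows:

        # Terms are nested tuples ('f', t1, ..., tn); variables are strings.
        # mu maps a symbol to its (1-based) argument positions open to rewriting;
        # symbols absent from mu have all argument positions active.
        def match(pattern, term, sub=None):
            """Syntactic matching; returns a substitution or None."""
            sub = dict(sub or {})
            if isinstance(pattern, str):                  # variable
                if pattern in sub and sub[pattern] != term:
                    return None
                sub[pattern] = term
                return sub
            if (not isinstance(term, tuple) or pattern[0] != term[0]
                    or len(pattern) != len(term)):
                return None
            for p, t in zip(pattern[1:], term[1:]):
                sub = match(p, t, sub)
                if sub is None:
                    return None
            return sub

        def apply_sub(t, sub):
            if isinstance(t, str):
                return sub[t]
            return (t[0],) + tuple(apply_sub(a, sub) for a in t[1:])

        def csr_step(t, rules, mu):
            """One context-sensitive step at the outermost allowed position,
            or None if t is a mu-normal form."""
            for l, r in rules:                            # try the root first
                sub = match(l, t)
                if sub is not None:
                    return apply_sub(r, sub)
            if isinstance(t, tuple):                      # then active arguments
                for i in mu.get(t[0], range(1, len(t))):
                    u = csr_step(t[i], rules, mu)
                    if u is not None:
                        return t[:i] + (u,) + t[i + 1:]
            return None

        # mu(if) = {1}: only the guard may be evaluated, never the two branches.
        rules = [(('if', ('true',), 'x', 'y'), 'x'),
                 (('if', ('false',), 'x', 'y'), 'y'),
                 (('not', ('true',)), ('false',)),
                 (('not', ('false',)), ('true',))]
        mu = {'if': [1]}
        t = ('if', ('not', ('true',)), ('a',), ('b',))
        t = csr_step(t, rules, mu)     # guard step: not(true) -> false
        print(t)                       # ('if', ('false',), ('a',), ('b',))
        print(csr_step(t, rules, mu))  # ('b',): the rule now fires at the root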

    Derivational Complexity and Context-Sensitive Rewriting

    Context-sensitive rewriting is a restriction of rewriting where reduction steps are allowed only on specific arguments μ(f) ⊆ {1, ..., k} of k-ary function symbols f. Terms which cannot be further rewritten in this way are called μ-normal forms. For left-linear term rewriting systems (TRSs), the so-called normalization via μ-normalization procedure provides a systematic way to obtain normal forms by the stepwise computation and combination of intermediate μ-normal forms. In this paper, we show how to obtain bounds on the derivational complexity of computations using this procedure by using bounds on the derivational complexity of context-sensitive rewriting. Two main applications are envisaged: normalization via μ-normalization can be used with non-terminating TRSs where the procedure still terminates; on the other hand, it can be used to improve on bounds of derivational complexity of terminating TRSs as it discards many rewritings. Partially supported by the EU (FEDER), and projects RTI2018-094403-B-C32 and PROMETEO/2019/098. Lucas Alba, S. (2021). Derivational Complexity and Context-Sensitive Rewriting. Journal of Automated Reasoning. 65(8):1191-1229. https://doi.org/10.1007/s10817-021-09603-1
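
    The classic example behind the first application is the non-terminating rule from(x) -> cons(x, from(s(x))) with μ(cons) = {1}: freezing the second argument of cons makes μ-normalization halt. A self-contained toy demo (our own hard-coded sketch of μ-rewriting for this single rule, not the paper's bound computation):

        def mu_step(t):
            """One context-sensitive step for from(x) -> cons(x, from(s(x))),
            outermost first; argument 2 of cons is frozen by mu(cons) = {1}."""
            if isinstance(t, tuple) and t[0] == 'from':
                return ('cons', t[1], ('from', ('s', t[1])))
            if isinstance(t, tuple) and t[0] == 'cons':
                u = mu_step(t[1])          # position 1 is active,
                if u is not None:          # position 2 stays frozen
                    return ('cons', u, t[2])
            return None

        def mu_normalize(t):
            """Apply allowed steps until a mu-normal form is reached."""
            while (u := mu_step(t)) is not None:
                t = u
            return t

        print(mu_normalize(('from', ('0',))))
        # ('cons', ('0',), ('from', ('s', ('0',)))): a mu-normal form, reached
        # in one step, although from(0) has no normal form at all.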

    Conditional Complexity

    We propose a notion of complexity for oriented conditional term rewrite systems. This notion is realistic in the sense that it measures not only successful computations but also partial computations that result in a failed rule application. A transformation to unconditional context-sensitive rewrite systems is presented which reflects this complexity notion, together with a technique to derive runtime and derivational complexity bounds for the latter.
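
    The paper's exact transformation is not reproduced here, but its flavour resembles the classic unraveling of a conditional rule, sketched below with our own example: a fresh symbol U stores an in-progress condition, and μ(U) = {1} forces all rewriting between the two U-rules to happen inside the condition, which is exactly the kind of partial computation the complexity notion counts, whether or not the rule application eventually succeeds.

        # Conditional rule:  member(x, cons(y, ys)) -> true  <=  eq(x, y) ->* true
        # Unraveled into two unconditional rules over a fresh symbol U:
        unraveled = [
            (('member', 'x', ('cons', 'y', 'ys')),
             ('U', ('eq', 'x', 'y'), 'x', 'y', 'ys')),   # start checking eq(x, y)
            (('U', ('true',), 'x', 'y', 'ys'),
             ('true',)),                                 # condition succeeded
        ]
        mu = {'U': [1]}   # only the condition argument of U may be rewritten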

    Proving Looping and Non-Looping Non-Termination by Finite Automata

    A new technique is presented to prove non-termination of term rewriting. The basic idea is to find a non-empty regular language of terms that is closed under rewriting and does not contain normal forms. It is automated by representing the language by a tree automaton with a fixed number of states, and expressing the mentioned requirements as a SAT formula. Satisfiability of this formula implies non-termination. Our approach succeeds for many examples where all earlier techniques fail, for instance for the S-rule from combinatory logic.
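
    The shape of such a certificate can be seen in a hand-built toy instance (our own example; the tool searches for the automaton via SAT instead of writing it down): for R = {f(x) -> f(f(x))} over the signature {a, f}, the terms containing at least one f form a non-empty regular language that is closed under rewriting and contains no normal forms, so R is non-terminating.

        # Tree automaton for "terms over {a, f} containing at least one f":
        final = {'qf'}
        trans = {('a',): {'q0'},          # a        -> q0   (no f yet)
                 ('f', 'q0'): {'qf'},     # f(no f)  -> qf
                 ('f', 'qf'): {'qf'}}     # f(has f) -> qf

        def reach(t):
            """Automaton states reachable for a term over {a, f}."""
            if t == ('a',):
                return trans[('a',)]
            return {q2 for q in reach(t[1]) for q2 in trans.get(('f', q), set())}

        # 1. Non-emptiness: f(a) is accepted.
        assert reach(('f', ('a',))) & final
        # 2. Closure under f(x) -> f(f(x)), with the variable x abstracted by a
        #    state q: whenever f(q) reaches some q1, f(f(q)) must reach q1 too.
        for q in ('q0', 'qf'):
            lhs = trans.get(('f', q), set())
            rhs = {q2 for p in lhs for q2 in trans.get(('f', p), set())}
            assert lhs <= rhs, "language not closed under rewriting"
        # 3. No normal forms: the only R-normal form over {a, f} is 'a' itself,
        #    and the automaton rejects it.
        assert not (reach(('a',)) & final)
        print("certificate checked: R is non-terminating")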

    Comparing and evaluating extended Lambek calculi

    Lambek's Syntactic Calculus, commonly referred to as the Lambek calculus, was innovative in many ways, notably as a precursor of linear logic. But it also showed that we could treat our grammatical framework as a logic (as opposed to a logical theory). However, though it was successful in giving at least a basic treatment of many linguistic phenomena, it was also clear that a slightly more expressive logical calculus was needed for many other cases. Therefore, many extensions and variants of the Lambek calculus have been proposed, from the eighties up until the present day. As a result, there is now a large class of calculi, each with its own empirical successes and theoretical results, but also each with its own logical primitives. This raises the question: how do we compare and evaluate these different logical formalisms? To answer this question, I present two unifying frameworks for these extended Lambek calculi. Both are proof net calculi with graph contraction criteria. The first calculus is a very general system: you specify the structure of your sequents and it gives you the connectives and contractions which correspond to it. The calculus can be extended with structural rules, which translate directly into graph rewrite rules. The second calculus is first-order (multiplicative intuitionistic) linear logic, which turns out to have several other, independently proposed extensions of the Lambek calculus as fragments. I will illustrate the use of each calculus in building bridges between analyses proposed in different frameworks, in highlighting differences and in helping to identify problems. Comment: Empirical advances in categorial grammars, Aug 2015, Barcelona, Spain