341 research outputs found
12th International Workshop on Termination (WST 2012) : WST 2012, February 19–23, 2012, Obergurgl, Austria / ed. by Georg Moser
This volume contains the proceedings of the 12th International Workshop on Termination (WST 2012), held February 19–23, 2012 in Obergurgl, Austria. The goal of the Workshop on Termination is to be a venue for the presentation and discussion of all topics in and around termination. In this way, the workshop tries to bridge the gaps between the different communities interested and active in research in and around termination. The 12th International Workshop on Termination in Obergurgl continues the successful workshops held in St. Andrews (1993), La Bresse (1995), Ede (1997), Dagstuhl (1999), Utrecht (2001), Valencia (2003), Aachen (2004), Seattle (2006), Paris (2007), Leipzig (2009), and Edinburgh (2010). The workshop welcomed contributions on all aspects of termination and complexity analysis. Contributions from the imperative, constraint, functional, and logic programming communities, and papers investigating applications of complexity or termination (for example in program transformation or theorem proving), were particularly welcome. We received 18 submissions, all of which were accepted; each paper was assigned two reviewers. In addition to these 18 contributed talks, WST 2012 hosts three invited talks by Alexander Krauss, Martin Hofmann, and Fausto Spoto.
A Combination Framework for Complexity
In this paper we present a combination framework for polynomial complexity
analysis of term rewrite systems. The framework covers both derivational and
runtime complexity analysis. We present generalisations of powerful complexity
techniques, notably a generalisation of complexity pairs and (weak) dependency
pairs. Finally, we also present a novel technique, called dependency graph
decomposition, that in the dependency pair setting greatly increases
modularity. We employ the framework in the automated complexity tool TCT. TCT
implements a majority of the techniques found in the literature, witnessing
that our framework is general enough to capture a very broad setting.
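As a loose illustration of the notion being automated here, the following is a minimal Python sketch (with a hypothetical term representation, and not TCT's implementation): for the rewrite system add(0, y) -> y, add(s(x), y) -> s(add(x, y)), runtime complexity counts innermost rewrite steps as a function of the size of the starting term, and for this system the bound is linear.

```python
# Minimal sketch (hypothetical term representation, not TCT itself):
# terms are nested tuples, e.g. ('add', ('s', ('0',)), ('0',)), and we
# count innermost rewrite steps for the TRS
#   add(0, y)    -> y
#   add(s(x), y) -> s(add(x, y))

def num(n):
    """The Peano numeral s(s(...(0)...)) with n successors."""
    t = ('0',)
    for _ in range(n):
        t = ('s', t)
    return t

def step(term):
    """Perform one innermost rewrite step; return (new_term, fired)."""
    if term[0] == 'add':
        x, y = term[1], term[2]
        x2, fired = step(x)                    # arguments first: innermost
        if fired:
            return ('add', x2, y), True
        y2, fired = step(y)
        if fired:
            return ('add', x, y2), True
        if x[0] == '0':                        # add(0, y) -> y
            return y, True
        return ('s', ('add', x[1], y)), True   # add(s(x), y) -> s(add(x, y))
    if term[0] == 's':
        inner, fired = step(term[1])
        return ('s', inner), fired
    return term, False

def normalize(term):
    """Rewrite to normal form, returning the result and the step count."""
    steps, fired = 0, True
    while fired:
        term, fired = step(term)
        steps += fired
    return term, steps
```

Here normalize(('add', num(n), num(m))) reaches the normal form num(n + m) in exactly n + 1 steps, the kind of (here linear) polynomial runtime bound such frameworks certify automatically.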
Complexity Analysis of Precedence Terminating Infinite Graph Rewrite Systems
The general form of safe recursion (or ramified recurrence) can be expressed
by an infinite graph rewrite system including unfolding graph rewrite rules
introduced by Dal Lago, Martini and Zorzi, in which the size of every normal
form by innermost rewriting is polynomially bounded. Every unfolding graph
rewrite rule is precedence terminating in the sense of Middeldorp, Ohsaki and
Zantema. Although precedence terminating infinite rewrite systems cover all the
primitive recursive functions, in this paper we consider graph rewrite systems
precedence terminating with argument separation, which form a subclass of
precedence terminating graph rewrite systems. We show that for any precedence
terminating infinite graph rewrite system G with a specific argument
separation, both the runtime complexity of G and the size of every normal form
in G can be polynomially bounded. As a corollary, we obtain an alternative
proof of the original result by Dal Lago et al.
Comment: In Proceedings TERMGRAPH 2014, arXiv:1505.06818. arXiv admin note: text overlap with arXiv:1404.619
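To make the precedence-termination condition concrete, here is a small sketch (terms as nested tuples, variables as bare strings; the symbol names and precedence values are assumptions): a rule set is precedence terminating when, for some well-founded precedence, the root symbol of each left-hand side is greater than every function symbol of the corresponding right-hand side. Ordinary recursive addition fails the check, while an unfolded, level-indexed family in the spirit of unfolding graph rewrite rules passes.

```python
# Sketch of the precedence-termination check of Middeldorp, Ohsaki and
# Zantema (hypothetical term representation; precedences are assumptions).

def funs(term):
    """Function symbols occurring in a term (variables are bare strings)."""
    if isinstance(term, str):
        return set()
    return {term[0]}.union(*(funs(arg) for arg in term[1:]))

def precedence_terminating(rules, prec):
    """root(lhs) must exceed every function symbol of rhs, for each rule."""
    return all(prec[lhs[0]] > prec[g]
               for lhs, rhs in rules for g in funs(rhs))

# Recursive addition: 'add' occurs in its own right-hand side, so no
# precedence can orient the rules.
recursive = [
    (('add', ('s', 'x'), 'y'), ('s', ('add', 'x', 'y'))),
    (('add', ('0',), 'y'), 'y'),
]

# An unfolded family add2, add1, add0, where each level calls only lower
# levels, is precedence terminating (one fragment of an infinite system).
unfolded = [
    (('add2', ('s', 'x'), 'y'), ('s', ('add1', 'x', 'y'))),
    (('add1', ('s', 'x'), 'y'), ('s', ('add0', 'x', 'y'))),
    (('add0', ('0',), 'y'), 'y'),
]
```

The price of orienting recursion this way is an infinite rule set, which is exactly why the paper works with precedence terminating infinite graph rewrite systems.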
Proof Theory at Work: Complexity Analysis of Term Rewrite Systems
This thesis is concerned with investigations into the "complexity of term
rewriting systems". Moreover, the majority of the presented work deals with the
"automation" of such a complexity analysis. The aim of this introduction is to
present the main ideas in an easily accessible fashion, making the presented
results accessible to a general audience. Necessarily, some technical points
are stated in an over-simplified way.
Comment: Cumulative Habilitation Thesis, submitted to the University of Innsbruck
Faithful (meta-)encodings of programmable strategies into term rewriting systems
Rewriting is a formalism widely used in computer science and mathematical
logic. When using rewriting as a programming or modeling paradigm, the rewrite
rules describe the transformations one wants to operate and rewriting
strategies are used to control their application. The operational semantics
of these strategies are generally accepted and approaches for analyzing the
termination of specific strategies have been studied. We propose in this paper
a generic encoding of classic control and traversal strategies used in rewrite
based languages such as Maude, Stratego and Tom into a plain term rewriting
system. The encoding is proven sound and complete and, as a direct consequence,
established termination methods used for term rewriting systems can be
applied to analyze the termination of strategy controlled term rewriting
systems. We show that the encoding of strategies into term rewriting systems
can be easily adapted to handle many-sorted signatures, and we use a
meta-level representation of terms to reduce the size of the encodings. The
corresponding implementation in Tom generates term rewriting systems compatible
with the syntax of termination tools such as AProVE and TTT2, tools which
turned out to be very effective in (dis)proving the termination of the
generated term rewriting systems. The approach can also be seen as a generic
strategy compiler which can be integrated into languages providing pattern
matching primitives; experiments in Tom show that applying our encoding yields
performance comparable to that of the native Tom strategies.
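A miniature of the combinator style such encodings target may help fix intuitions. This is a hypothetical Python sketch, not the actual API of Tom, Stratego, or Maude: a strategy maps a term to a rewritten term, or to None on failure, and classic traversals such as innermost are assembled from a few combinators.

```python
# Hypothetical strategy combinators (terms as nested tuples, variables as
# strings); this mimics the style of Stratego/Tom strategies, not their APIs.

def seq(s1, s2):
    """Apply s1, then s2; fail (None) if either fails."""
    def go(t):
        u = s1(t)
        return None if u is None else s2(u)
    return go

def choice(s1, s2):
    """Apply s1; on failure, apply s2 to the original term."""
    def go(t):
        u = s1(t)
        return u if u is not None else s2(t)
    return go

def try_(s):
    return choice(s, lambda t: t)            # never fails

def all_(s):
    """Apply s to every immediate subterm; fail if any application fails."""
    def go(t):
        if isinstance(t, str):
            return t
        kids = [s(arg) for arg in t[1:]]
        return None if None in kids else (t[0], *kids)
    return go

def bottomup(s):
    return seq(all_(lambda t: bottomup(s)(t)), s)

def innermost(s):
    """Classic definition: innermost(s) = bottomup(try(seq(s, innermost(s))))."""
    return bottomup(try_(seq(s, lambda t: innermost(s)(t))))

def add_rule(t):
    """Peano addition rules, packaged as a single-step strategy."""
    if isinstance(t, tuple) and t[0] == 'add':
        x, y = t[1], t[2]
        if x == ('0',):
            return y                          # add(0, y) -> y
        if isinstance(x, tuple) and x[0] == 's':
            return ('s', ('add', x[1], y))    # add(s(x), y) -> s(add(x, y))
    return None
```

With these definitions, innermost(add_rule) normalizes ('add', s(s(0)), s(0)) to s(s(s(0))); the encoding in the paper achieves the analogous effect inside a plain term rewriting system rather than a host language.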
Polynomial Path Orders: A Maximal Model
This paper is concerned with the automated complexity analysis of term
rewrite systems (TRSs for short) and the ramification of these in implicit
computational complexity theory (ICC for short). We introduce a novel path
order with multiset status, the polynomial path order POP*. Essentially relying
on the principle of predicative recursion as proposed by Bellantoni and Cook,
its distinct feature is the tight control of resources on compatible TRSs: The
(innermost) runtime complexity of compatible TRSs is polynomially bounded. We
have implemented the technique; as underpinned by our experimental evidence,
our approach to automated runtime complexity analysis is not only feasible but,
compared to existing methods, remarkably fast. As an application in the context
of ICC we provide an order-theoretic characterisation of the polytime
computable functions. To be precise, the polytime computable functions are
exactly the functions computable by an orthogonal constructor TRS compatible
with POP*
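The principle of predicative recursion underlying POP* can be glimpsed in a small Python sketch. This illustrates Bellantoni and Cook's discipline itself, not POP*; the split of arguments into normal and safe positions (written with ";") is indicated only in comments: recursion runs on a normal argument, and the result of a recursive call may only ever be used in a safe position.

```python
# Bellantoni-Cook style safe recursion, sketched on unary numbers.
# Write f(normal; safe): recursion is on a normal argument, and recursive
# results may only appear in safe positions.

def add(x, y):
    """add(x; y): recursion on normal x; y and the recursive result stay safe."""
    return y if x == 0 else 1 + add(x - 1, y)   # successor of a safe value

def mult(x, y):
    """mult(x, y;): the recursive result is passed into add's safe slot."""
    return 0 if x == 0 else add(y, mult(x - 1, y))

# By contrast, exp(x) = 2^x would need to feed the recursive result exp(x - 1)
# into a normal (recursion) position, which the discipline forbids; this is
# the source of the polytime bound that POP* captures order-theoretically.
```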
Improving Dependency Tuples for Almost-Sure Innermost Termination of Probabilistic Term Rewriting
Recently, we adapted the well-known dependency pair (DP) framework to a
dependency tuple (DT) framework in order to prove almost-sure innermost
termination (iAST) of probabilistic term rewriting systems. In this paper, we
improve this approach into a complete criterion for iAST by considering
positions of subterms. Based on this, we extend the probabilistic DT framework
by new transformations. Our implementation in the tool AProVE shows that they
increase its power substantially
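For intuition about almost-sure innermost termination, consider the probabilistic rule f -> {p : epsilon ; 1 - p : f f}, whose innermost derivations form a random walk on the number of pending f symbols. The following is an illustrative simulation only (unrelated to AProVE's actual machinery): for p > 1/2 the walk terminates almost surely with finite expected derivation length.

```python
import random

def derivation_length(p_stop, rng, cap=100_000):
    """Simulate innermost rewriting of the probabilistic rule
       f -> { p_stop : epsilon  ;  1 - p_stop : f f }
    starting from a single f; each step either removes one pending f or
    adds one. Returns the derivation length, or None if the cap is hit."""
    pending, steps = 1, 0
    while pending and steps < cap:
        steps += 1
        pending += 1 if rng.random() > p_stop else -1
    return steps if pending == 0 else None
```

With p_stop = 3/4 the expected derivation length from one f is 1/(2 * p_stop - 1) = 2. At p_stop = 1/2 the system is still almost-surely terminating, yet the expected derivation length is infinite; this gap between iAST and expected bounds is part of what makes proof techniques for probabilistic rewriting delicate.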
Needed Computations Shortcutting Needed Steps
We define a compilation scheme for a constructor-based, strongly-sequential,
graph rewriting system which shortcuts some needed steps. The object code is
another constructor-based graph rewriting system. This system is normalizing
for the original system when using an innermost strategy. Consequently, the
object code can be easily implemented by eager functions in a variety of
programming languages. We modify this object code in a way that avoids total or
partial construction of the contracta of some needed steps of a computation.
When computing normal forms in this way, both memory consumption and execution
time are reduced compared to ordinary rewriting computations in the original
system.
Comment: In Proceedings TERMGRAPH 2014, arXiv:1505.06818
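A very loose Python analogue of the compilation idea (assumed representation; the paper works with graph rewriting and a richer object language): a naive interpreter materialises a contractum at every needed step, while a compiled eager version produces the same normal form without constructing the intermediate add-rooted terms.

```python
# Loose sketch: Peano terms as nested tuples. The 'compiled' eager code
# computes the same normal form as stepwise rewriting but skips building
# the intermediate contracta s(add(x, y)) of the needed steps.

def num(n):
    t = ('0',)
    for _ in range(n):
        t = ('s', t)
    return t

def add_naive(x, y):
    """Stepwise: each call materialises one contractum of a needed step."""
    return y if x == ('0',) else ('s', add_naive(x[1], y))

def add_compiled(x, y):
    """Eager object code: measure x's s-spine once, then extend y directly."""
    depth = 0
    while x != ('0',):
        depth, x = depth + 1, x[1]
    for _ in range(depth):
        y = ('s', y)
    return y
```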