Proving Looping and Non-Looping Non-Termination by Finite Automata
A new technique is presented to prove non-termination of term rewriting. The
basic idea is to find a non-empty regular language of terms that is closed
under rewriting and does not contain normal forms. It is automated by
representing the language by a tree automaton with a fixed number of states,
and expressing the mentioned requirements in a SAT formula. Satisfiability of
this formula implies non-termination. Our approach succeeds for many examples
where all earlier techniques fail, for instance for the S-rule from combinatory
logic.
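The certificate idea above (a non-empty language closed under rewriting and containing no normal forms) can be checked directly on a toy string rewriting system. This sketch replaces the paper's tree-automaton/SAT machinery with a hand-picked regular language and verifies the closure conditions only on finitely many members; all names are illustrative:

```python
import re

# Toy SRS with the single rule a -> aa, and candidate language L = a+.
rules = [("a", "aa")]
lang = re.compile(r"^a+$")

def successors(s):
    """All one-step rewrites of s (every redex position, every rule)."""
    out = []
    for lhs, rhs in rules:
        for i in range(len(s)):
            if s.startswith(lhs, i):
                out.append(s[:i] + rhs + s[i + len(lhs):])
    return out

# L is non-empty, closed under rewriting, and contains no normal forms
# (checked here only on members up to length 5), which certifies that
# the SRS admits an infinite rewrite sequence.
for s in ("a" * k for k in range(1, 6)):
    succ = successors(s)
    assert succ, f"{s} is a normal form"
    assert all(lang.match(t) for t in succ), "L not closed under rewriting"
print("certificate checked on finitely many members of L")
```

In the paper the language is represented by a tree automaton with a fixed number of states and the closure conditions become a SAT formula; the finite spot-check above only illustrates what that formula asserts.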
Extending Context-Sensitivity in Term Rewriting
We propose a generalized version of context-sensitivity in term rewriting
based on the notion of "forbidden patterns". The basic idea is that a rewrite
step should be forbidden if the redex to be contracted has a certain shape and
appears in a certain context. This shape and context are expressed through
forbidden patterns. In particular, we analyze the relationship between this novel
approach and the commonly used notion of context-sensitivity in term rewriting,
as well as the feasibility of rewriting with forbidden patterns from a
computational point of view. The latter feasibility is characterized by
demanding that restricting a rewrite relation yields an improved termination
behaviour while still being powerful enough to compute meaningful results.
Sufficient criteria for both kinds of properties in certain classes of rewrite
systems with forbidden patterns are presented.
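The best-known special case of such restrictions is context-sensitive rewriting via a replacement map, which forbids rewriting in designated argument positions. A minimal sketch, assuming terms are nested tuples and using an illustrative rule (not the paper's forbidden-pattern formalism):

```python
# Terms as tuples: ("cons", t1, t2); variables and constants as strings.
MU = {"cons": {1}}  # replacement map: only cons's 1st argument is active

def rewrite_step(t):
    """One context-sensitive step for the rule from(x) -> cons(x, from(s(x)))."""
    if isinstance(t, tuple):
        if t[0] == "from":                       # redex at an active position
            x = t[1]
            return ("cons", x, ("from", ("s", x)))
        active = MU.get(t[0], set(range(1, len(t))))
        for i in sorted(active):                 # recurse only into active args
            new = rewrite_step(t[i])
            if new is not None:
                return t[:i] + (new,) + t[i + 1:]
    return None

t = ("from", "0")
steps = 0
while (nxt := rewrite_step(t)) is not None:
    t, steps = nxt, steps + 1
# Without MU the nested from(...) would be unfolded forever; with MU it
# sits at a forbidden position, so rewriting stops after one step.
print(steps, t)  # 1 ('cons', '0', ('from', ('s', '0')))
```

This shows the trade-off the abstract describes: the restriction improves termination behaviour while the reachable term still carries the meaningful head of the result.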
Termination orders for 3-dimensional rewriting
This paper studies 3-polygraphs as a framework for rewriting on
two-dimensional words. A translation of term rewriting systems into
3-polygraphs with explicit resource management is given, and the respective
computational properties of each system are studied. Finally, a convergent
3-polygraph for the (commutative) theory of Z/2Z-vector spaces is given. In
order to prove these results, it is explained how to craft a class of
termination orders for 3-polygraphs. Comment: 30 pages, 35 figures
Termination of Rewriting with and Automated Synthesis of Forbidden Patterns
We introduce a modified version of the well-known dependency pair framework
that is suitable for the termination analysis of rewriting under forbidden
pattern restrictions. By attaching contexts to dependency pairs that represent
the calling contexts of the corresponding recursive function calls, it is
possible to incorporate the forbidden pattern restrictions in the (adapted)
notion of dependency pair chains, thus yielding a sound and complete approach
to termination analysis. Building upon this contextual dependency pair
framework we introduce a dependency pair processor that simplifies problems by
analyzing the contextual information of the dependency pairs. Moreover, we show
how this processor can be used to synthesize forbidden patterns suitable for a
given term rewriting system on the fly during the termination analysis. Comment: In Proceedings IWS 2010, arXiv:1012.533
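The classical dependency pair construction that the contextual framework above extends can be sketched in a few lines; the example TRS and the uppercase "tuple symbol" marking are illustrative conventions, and the contexts the paper attaches to each pair are not modeled here:

```python
# Terms as tuples ("f", arg1, ...); variables as plain strings;
# constants as 1-tuples such as ("0",).
rules = [
    (("minus", "x", ("0",)), "x"),
    (("minus", ("s", "x"), ("s", "y")), ("minus", "x", "y")),
]

defined = {l[0] for l, _ in rules}   # root symbols of left-hand sides

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for a in t[1:]:
            yield from subterms(a)

def mark(t):
    """Replace the root symbol by its marked (tuple) version."""
    return (t[0].upper(),) + t[1:]

# One dependency pair per defined-symbol subterm of a right-hand side:
# it records a recursive call whose (non-)termination must be analyzed.
dps = [(mark(l), mark(s))
       for l, r in rules
       for s in subterms(r)
       if isinstance(s, tuple) and s[0] in defined]
print(dps)  # [(('MINUS', ('s', 'x'), ('s', 'y')), ('MINUS', 'x', 'y'))]
```

In the contextual framework each such pair would additionally carry the calling context of the recursive call, which is what allows the forbidden-pattern restrictions to be incorporated into the notion of chains.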
Infinitary Combinatory Reduction Systems: Confluence
We study confluence in the setting of higher-order infinitary rewriting, in
particular for infinitary Combinatory Reduction Systems (iCRSs). We prove that
fully-extended, orthogonal iCRSs are confluent modulo identification of
hypercollapsing subterms. As a corollary, we obtain that fully-extended,
orthogonal iCRSs have the normal form property and the unique normal form
property (with respect to reduction). We also show that, unlike the case in
first-order infinitary rewriting, almost non-collapsing iCRSs are not
necessarily confluent.
On the confluence of lambda-calculus with conditional rewriting
The confluence of untyped \lambda-calculus with unconditional rewriting is
now well understood. In this paper, we investigate the confluence of
\lambda-calculus with conditional rewriting and provide general results in two
directions. First, when conditional rules are algebraic. This extends results
of M\"uller and Dougherty for unconditional rewriting. Two cases are
considered, whether \beta-reduction is allowed or not in the evaluation of
conditions. Moreover, Dougherty's result is improved from the assumption of
strongly normalizing \beta-reduction to weakly normalizing \beta-reduction. We
also provide examples showing that outside these conditions, modularity of
confluence is difficult to achieve. Second, we go beyond the algebraic
framework and get new confluence results using a restricted notion of
orthogonality that takes advantage of the conditional part of rewrite rules.
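A minimal sketch of rewriting with an algebraic conditional rule, assuming the condition is evaluated by normalizing its instance with auxiliary unconditional rules (no \beta-reduction involved); the rules and symbol names are illustrative:

```python
# Peano numerals as nested tuples; "0", "true", "false" as plain strings.
def norm(t):
    """Innermost normalization w.r.t. the unconditional leq-rules:
    leq(0, y) -> true; leq(s(x), 0) -> false; leq(s(x), s(y)) -> leq(x, y)."""
    if not isinstance(t, tuple):
        return t
    t = (t[0],) + tuple(norm(a) for a in t[1:])
    if t[0] == "leq":
        if t[1] == "0":
            return "true"
        if isinstance(t[1], tuple) and t[1][0] == "s":
            if t[2] == "0":
                return "false"
            if isinstance(t[2], tuple) and t[2][0] == "s":
                return norm(("leq", t[1][1], t[2][1]))
    return t

def apply_min(x, y):
    """Conditional rules: min(x,y) -> x <= leq(x,y) ->* true, else min(x,y) -> y.
    The condition is checked by normalizing its instance, as in oriented
    conditional rewriting without beta."""
    return x if norm(("leq", x, y)) == "true" else y

two = ("s", ("s", "0"))
three = ("s", two)
print(apply_min(two, three))  # ('s', ('s', '0'))
```

Allowing \beta-steps in the evaluation of conditions, as the abstract discusses, changes which instances of a condition succeed and is one of the two cases the paper analyzes.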
Termination of Rewriting with Right-Flat Rules Modulo Permutative Theories
We present decidability results for termination of classes of term rewriting
systems modulo permutative theories. Termination and innermost termination
modulo permutative theories are shown to be decidable for term rewrite systems
(TRS) whose right-hand side terms are restricted to be shallow (variables occur
at depth at most one) and linear (each variable occurs at most once). Innermost
termination modulo permutative theories is also shown to be decidable for
shallow TRS. We first show that a shallow TRS can be transformed into a flat
(only variables and constants occur at depth one) TRS while preserving
termination and innermost termination. The decidability results are then proved
by showing that (a) for right-flat right-linear (flat) TRS, non-termination
(respectively, innermost non-termination) implies non-termination starting from
flat terms, and (b) for right-flat TRS, the existence of non-terminating
derivations starting from a given term is decidable. On the negative side, we
show PSPACE-hardness of termination and innermost termination for shallow
right-linear TRS, and undecidability of termination for flat TRS. Comment: 20 pages
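The syntactic restrictions in the abstract above (shallow, flat, linear) are easy to check mechanically. A minimal sketch, under the assumption that terms are nested tuples with string variables and 1-tuple constants:

```python
# Terms as tuples ("f", arg1, ...); variables are plain strings;
# constants are 1-tuples such as ("0",).
def depth_of_vars(t, d=0):
    """Yield the depth of every variable occurrence in t."""
    if isinstance(t, str):
        yield d
    else:
        for a in t[1:]:
            yield from depth_of_vars(a, d + 1)

def variables(t):
    if isinstance(t, str):
        yield t
    else:
        for a in t[1:]:
            yield from variables(a)

def is_shallow(t):  # variables occur at depth at most one
    return all(d <= 1 for d in depth_of_vars(t))

def is_flat(t):     # only variables and constants occur at depth one
    return isinstance(t, str) or all(
        isinstance(a, str) or len(a) == 1 for a in t[1:])

def is_linear(t):   # each variable occurs at most once
    vs = list(variables(t))
    return len(vs) == len(set(vs))

rhs = ("f", "x", ("0",))                 # f(x, 0): flat (hence shallow), linear
print(is_flat(rhs), is_shallow(rhs), is_linear(rhs))   # True True True
print(is_flat(("f", ("g", ("0",)), "x")))              # False: g(0) at depth 1
```

A TRS is right-flat (right-linear) when every right-hand side passes `is_flat` (`is_linear`), which is the shape the decidability results above apply to.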
Towards 3-Dimensional Rewriting Theory
String rewriting systems have proved very useful to study monoids. In good
cases, they give finite presentations of monoids, allowing computations on
those and their manipulation by a computer. Even better, when the presentation
is confluent and terminating, they provide one with a notion of canonical
representative of the elements of the presented monoid. Polygraphs are a
higher-dimensional generalization of this notion of presentation, from the
setting of monoids to the much more general setting of n-categories. One of the
main purposes of this article is to give a progressive introduction to the
notion of higher-dimensional rewriting system provided by polygraphs, and
describe its links with classical rewriting theory, string and term rewriting
systems in particular. After introducing the general setting, we will be
interested in proving local confluence for polygraphs presenting 2-categories
and introduce a framework in which a finite 3-dimensional rewriting system
admits a finite number of critical pairs.
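In the 1-dimensional (string rewriting) case mentioned above, critical pairs arise from overlaps of left-hand sides, and there are finitely many of them for a finite system. A minimal sketch, handling only proper suffix/prefix overlaps and using an illustrative one-rule system:

```python
def critical_pairs(rules):
    """Critical pairs from proper overlaps: a suffix of one left-hand side
    equals a prefix of another. (Containment overlaps are omitted here.)"""
    cps = []
    for l1, r1 in rules:
        for l2, r2 in rules:
            for k in range(1, min(len(l1), len(l2))):
                if l1[-k:] == l2[:k]:
                    # The overlapping word l1 + l2[k:] can be reduced in two
                    # ways; the pair records both one-step results.
                    cps.append((r1 + l2[k:], l1[:-k] + r2))
    return cps

# Single rule aba -> b: "ababa" reduces to both "bba" and "abb".
print(critical_pairs([("aba", "b")]))  # [('bba', 'abb')]
# Rule ba -> ab has no self-overlap, hence no critical pairs.
print(critical_pairs([("ba", "ab")]))  # []
```

Local confluence of the SRS amounts to joinability of each such pair; the article extends this analysis to polygraphs, where ensuring a finite set of critical pairs requires the dedicated framework it introduces.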
Complexity Hierarchies and Higher-order Cons-free Term Rewriting
Constructor rewriting systems are said to be cons-free if, roughly,
constructor terms in the right-hand sides of rules are subterms of the
left-hand sides; the computational intuition is that rules cannot build new
data structures. In programming language research, cons-free languages have
been used to characterize hierarchies of computational complexity classes; in
term rewriting, cons-free first-order TRSs have been used to characterize the
class PTIME.
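A minimal sketch of the first-order cons-freeness check described above, under the assumption that terms are nested tuples and the constructor set is given explicitly (all symbol names are illustrative):

```python
# Terms as tuples ("f", arg1, ...); variables are plain strings;
# constants are 1-tuples such as ("nil",).
CONSTRUCTORS = {"0", "s", "nil", "cons"}

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for a in t[1:]:
            yield from subterms(a)

def is_constructor_term(t):
    if isinstance(t, str):          # a variable
        return True
    return t[0] in CONSTRUCTORS and all(is_constructor_term(a) for a in t[1:])

def cons_free_rule(lhs, rhs):
    """Every constructor subterm of the right-hand side must already occur
    as a subterm of the left-hand side: rules cannot build new data."""
    lhs_subs = set(subterms(lhs))
    return all(s in lhs_subs
               for s in subterms(rhs)
               if isinstance(s, tuple) and is_constructor_term(s))

# last(cons(x, nil)) -> x only reuses existing data: cons-free.
ok = cons_free_rule(("last", ("cons", "x", ("nil",))), "x")
# dup(x) -> cons(x, cons(x, nil)) builds a new list: not cons-free.
bad = cons_free_rule(("dup", "x"), ("cons", "x", ("cons", "x", ("nil",))))
print(ok, bad)  # True False
```

The higher-order systems studied in the paper use the same cons-freeness restriction on rules; what varies is the type order, which is what drives the complexity classes characterized.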
We investigate cons-free higher-order term rewriting systems, the complexity
classes they characterize, and how these depend on the type order of the
systems. We prove that, for every K ≥ 1, left-linear cons-free systems
with type order K characterize E^KTIME if unrestricted evaluation is used
(i.e., the system does not have a fixed reduction strategy).
The main difference with prior work in implicit complexity is that (i) our
results hold for non-orthogonal term rewriting systems with no assumptions on
reduction strategy, (ii) we consequently obtain much larger classes for each
type order (E^KTIME versus EXPTIME), and (iii) results for cons-free
term rewriting systems have previously only been obtained for K = 1, and with
additional syntactic restrictions besides cons-freeness and left-linearity.
Our results are among the first implicit characterizations of the hierarchy E
= E^1TIME \subsetneq E^2TIME \subsetneq .... Our work confirms prior
results that having full non-determinism (via overlapping rules) does not
directly allow for characterization of non-deterministic complexity classes
like NE. We also show that non-determinism makes the classes characterized
highly sensitive to minor syntactic changes like admitting product types or
non-left-linear rules. Comment: Extended version of a paper submitted to FSCD 2016. arXiv admin note:
substantial text overlap with arXiv:1604.0893
Complexity Hierarchies and Higher-Order Cons-Free Rewriting
Constructor rewriting systems are said to be cons-free if, roughly,
constructor terms in the right-hand sides of rules are subterms of constructor
terms in the left-hand side; the computational intuition is that rules cannot
build new data structures. It is well-known that cons-free programming
languages can be used to characterize computational complexity classes, and
that cons-free first-order term rewriting can be used to characterize the set
of polynomial-time decidable sets.
We investigate cons-free higher-order term rewriting systems, the complexity
classes they characterize, and how these depend on the order of the types used
in the systems. We prove that, for every k ≥ 1, left-linear cons-free
systems with type order k characterize E^kTIME if arbitrary evaluation is
used (i.e., the system does not have a fixed reduction strategy).
The main difference with prior work in implicit complexity is that (i) our
results hold for non-orthogonal term rewriting systems with possible rule
overlaps with no assumptions about reduction strategy, (ii) results for such
term rewriting systems have previously only been obtained for k = 1, and with
additional syntactic restrictions on top of cons-freeness and left-linearity.
Our results are apparently among the first implicit characterizations of the
hierarchy E = E^1TIME \subsetneq E^2TIME \subsetneq .... Our work
confirms prior results that having full non-determinism (via overlaps of rules)
does not directly allow characterization of non-deterministic complexity
classes like NE. We also show that non-determinism makes the classes
characterized highly sensitive to minor syntactic changes such as admitting
product types or non-left-linear rules. Comment: Extended version (with appendices) of a paper published in FSCD 201