(HO)RPO Revisited
The notion of computability closure has been introduced for proving the
termination of the combination of higher-order rewriting and beta-reduction. It
is also used for strengthening the higher-order recursive path ordering. In the
present paper, we study in more detail the relations between the computability
closure and the (higher-order) recursive path ordering. We show that the
first-order recursive path ordering is equal to an ordering naturally defined
from the computability closure. In the higher-order case, we get an ordering
containing the higher-order recursive path ordering whose well-foundedness
relies on the correctness of the computability closure. This provides a simple
way to extend the higher-order recursive path ordering to richer type systems.
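A minimal sketch may help fix intuitions. The following Haskell fragment
implements the lexicographic variant of the first-order recursive path
ordering (LPO); the `Term` type and the precedence `prec` are illustrative
assumptions, not definitions taken from the paper.

```haskell
-- Lexicographic path ordering (LPO), a standard variant of RPO.
-- Term representation and precedence are illustrative assumptions.
data Term = Var String | App String [Term] deriving (Eq, Show)

-- Illustrative precedence on function symbols (larger = greater).
prec :: String -> Int
prec "mul" = 2
prec "add" = 1
prec _     = 0

-- Does the variable x occur in term u?
occursIn :: Term -> Term -> Bool
occursIn x u@(App _ us) = x == u || any (occursIn x) us
occursIn x u            = x == u

-- Strict LPO: s > t.
lpoGt :: Term -> Term -> Bool
lpoGt s t = case (s, t) of
  (Var _, _)       -> False                          -- variables are minimal
  (App _ _, Var _) -> occursIn t s                   -- s > x iff x occurs in s
  (App f ss, App g ts)
    | any (\si -> si == t || lpoGt si t) ss -> True  -- subterm case
    | prec f > prec g -> all (lpoGt s) ts            -- precedence case
    | f == g          -> lexGt ss ts && all (lpoGt s) ts
    | otherwise       -> False
  where
    lexGt (a : as) (b : bs)
      | a == b    = lexGt as bs
      | otherwise = lpoGt a b
    lexGt _ _ = False
```

Under this precedence, `lpoGt` orients distributivity: `mul(x, add(y, z))` is
greater than `add(mul(x, y), mul(x, z))`, the usual textbook check.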
Implementation of higher-order absorbing boundary conditions for the Einstein equations
We present an implementation of absorbing boundary conditions for the
Einstein equations based on the recent work of Buchman and Sarbach. In this
paper, we assume that spacetime may be linearized about Minkowski space close
to the outer boundary, which is taken to be a coordinate sphere. We reformulate
the boundary conditions as conditions on the gauge-invariant
Regge-Wheeler-Zerilli scalars. Higher-order radial derivatives are eliminated
by rewriting the boundary conditions as a system of ODEs for a set of auxiliary
variables intrinsic to the boundary. From these we construct boundary data for
a set of well-posed constraint-preserving boundary conditions for the Einstein
equations in a first-order generalized harmonic formulation. This construction
has direct applications to outer boundary conditions in simulations of isolated
systems (e.g., binary black holes) as well as to the problem of
Cauchy-perturbative matching. As a test problem for our numerical
implementation, we consider linearized multipolar gravitational waves in TT
gauge, with angular momentum numbers l=2 (Teukolsky waves), 3 and 4. We
demonstrate that the perfectly absorbing boundary condition B_L of order L=l
yields no spurious reflections to linear order in perturbation theory. This is
in contrast to the lower-order absorbing boundary conditions B_L with L<l,
which include the widely used freezing-Psi_0 boundary condition that imposes
the vanishing of the Newman-Penrose scalar Psi_0.
Comment: 25 pages, 9 figures. Minor clarifications. Final version to appear in Class. Quantum Grav.
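The role of the order L can be sketched with the standard flat-space argument
(a Bayliss-Turkel-type computation consistent with the linearization
assumption above; it is an orientation aid, not the paper's construction,
whose conditions act on the Regge-Wheeler-Zerilli scalars):

```latex
% Outgoing solutions of the flat-space wave equation admit the expansion
\Phi(t,r,\theta,\varphi) = \sum_{j \ge 1} \frac{f_j(t-r,\theta,\varphi)}{r^j},
% and the Bayliss--Turkel operator of order L,
\mathcal{B}_L \equiv \prod_{j=1}^{L} \Big( \partial_t + \partial_r + \frac{2j-1}{r} \Big),
% annihilates the first L terms of this expansion. Imposing B_L Phi = 0 at
% the outer boundary therefore absorbs outgoing radiation through order
% r^{-L}; for a pure multipole of order l, choosing L = l leaves no
% reflection at linear order, matching the numerical results above.
```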
The use of proof plans in tactic synthesis
We undertake a programme of tactic synthesis. We first formalize the notion of
a tactic as a rewrite rule, then give a correctness criterion for this by means of a
reflection mechanism in the constructive type theory OYSTER. We further formalize
the notion of a tactic specification, given as a synthesis goal and a decidability
goal. We use a proof planner, CLAM, to guide the search for inductive proofs
of these, and are able to successfully synthesize several tactics in this fashion.
This involves two extensions to existing methods: context-sensitive rewriting and
higher-order wave rules. Further, we show that from a proof of the decidability
goal one may compile a pseudo-tactic to a Prolog program, which may be run to
efficiently simulate the input/output behaviour of the synthetic tactic.
Rewriting with strategies in ELAN: a functional semantics
Article submitted in 1999 and finally published in 2001. In this work, we consider term rewriting from a functional point of view. A rewrite rule is a function that can be applied to a term using an explicit application function. From this starting point, we show how to build more elaborate functions, describing first rewrite derivations, then sets of derivations. These functions, which we call strategies, can themselves be defined by rewrite rules, and the construction can be iterated, leading to higher-order strategies. Furthermore, the application function is itself defined using rewriting in the same spirit. We present this calculus and study its properties. Its implementation in the ELAN language is used to motivate and exemplify the whole approach. The expressiveness of ELAN is illustrated by examples of polymorphic functions and strategies.
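The functional reading of rules and strategies described above can be rendered
directly in a typed language. A minimal Haskell sketch (combinator names are
illustrative, not ELAN's actual primitives):

```haskell
-- A rewrite rule as a partial function on terms, and a strategy as a
-- function from a term to the set of its possible results. Names are
-- illustrative; they are not ELAN's actual primitives.
type Rule a     = a -> Maybe a
type Strategy a = a -> [a]

-- Explicit application of a single rule (zero or one result).
apply :: Rule a -> Strategy a
apply r t = maybe [] (: []) (r t)

-- Sequential composition: run s1, then s2 on each of its results.
andThen :: Strategy a -> Strategy a -> Strategy a
andThen s1 s2 t = concatMap s2 (s1 t)

-- Non-deterministic choice: results of either strategy.
orElse :: Strategy a -> Strategy a -> Strategy a
orElse s1 s2 t = s1 t ++ s2 t

-- Repeat a strategy until it no longer applies (normalisation).
repeatS :: Strategy a -> Strategy a
repeatS s t = case s t of
  [] -> [t]
  ts -> concatMap (repeatS s) ts
```

Because strategies are ordinary functions, combinators such as `repeatS` take
strategies as arguments and return strategies, which is exactly the iteration
towards higher-order strategies the abstract describes.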
Higher-order subtyping and its decidability
We define the typed lambda calculus Fω∧ (F-omega-meet), a natural generalization of Girard's system Fω (F-omega) with intersection types and bounded polymorphism. A novel aspect of our presentation is the use of term rewriting techniques to present intersection types, which clearly splits the computational semantics (reduction rules) from the syntax (inference rules) of the system. We establish properties such as Church-Rosser for the reduction relation on types and terms, and strong normalization for the reduction on types. We prove that types are preserved by computation (subject reduction), and that the system satisfies the minimal types property. We define algorithms for type checking and subtype checking. The development culminates with the proof of decidability of typing in Fω∧, containing the first proof of decidability of subtyping of a higher-order lambda calculus with subtyping.
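For orientation, the intersection-type fragment of the subtype relation is
governed by the standard rules recalled below (general literature, not the
paper's rewriting-based presentation, which additionally handles bounded
quantification and type operators):

```latex
% Standard subtyping rules for intersection types (orientation only):
% an intersection is below each component, and above any common lower bound.
\frac{}{T_1 \wedge T_2 \le T_i} \;(i = 1, 2)
\qquad\qquad
\frac{S \le T_1 \quad S \le T_2}{S \le T_1 \wedge T_2}
```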
Higher-order port-graph rewriting
The biologically inspired framework of port-graphs has been successfully used
to specify complex systems. It is the basis of the PORGY modelling tool. To
facilitate the specification of proof normalisation procedures via graph
rewriting, in this paper we add higher-order features to the original
port-graph syntax, along with a generalised notion of graph morphism. We
provide a matching algorithm which makes it possible to implement higher-order
port-graph rewriting in PORGY, so that one can visually study the dynamics of
the modelled systems. We illustrate the expressive power of higher-order port-graphs with
examples taken from proof-net reduction systems.
Comment: In Proceedings LINEARITY 2012, arXiv:1211.348
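One plausible concrete encoding of port-graphs, to make the objects being
rewritten tangible (the types and field names below are assumptions for
illustration, not PORGY's internal representation):

```haskell
-- A port-graph: nodes carry a label and named ports; edges connect a
-- port of one node to a port of another. Illustrative encoding only,
-- not PORGY's internal representation.
type NodeId = Int
type Port   = String

data Node = Node
  { nodeId :: NodeId
  , label  :: String
  , ports  :: [Port]
  } deriving Show

data Edge = Edge
  { from :: (NodeId, Port)
  , to   :: (NodeId, Port)
  } deriving Show

data PortGraph = PortGraph
  { nodes :: [Node]
  , edges :: [Edge]
  } deriving Show

-- A port-graph rewrite rule: left- and right-hand side graphs together
-- with a correspondence between their interface ports.
data PGRule = PGRule
  { lhs       :: PortGraph
  , rhs       :: PortGraph
  , interface :: [((NodeId, Port), (NodeId, Port))]
  } deriving Show
```

Matching a rule then means finding a graph morphism from `lhs` into the host
graph; the paper's higher-order extension generalises this notion of morphism.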
Complexity Hierarchies and Higher-order Cons-free Term Rewriting
Constructor rewriting systems are said to be cons-free if, roughly,
constructor terms in the right-hand sides of rules are subterms of the
left-hand sides; the computational intuition is that rules cannot build new
data structures. In programming language research, cons-free languages have
been used to characterize hierarchies of computational complexity classes; in
term rewriting, cons-free first-order TRSs have been used to characterize the
class PTIME.
We investigate cons-free higher-order term rewriting systems, the complexity
classes they characterize, and how these depend on the type order of the
systems. We prove that, for every K ≥ 1, left-linear cons-free systems
with type order K characterize E^K TIME if unrestricted evaluation is used
(i.e., the system does not have a fixed reduction strategy).
The main difference with prior work in implicit complexity is that (i) our
results hold for non-orthogonal term rewriting systems with no assumptions on
reduction strategy, (ii) we consequently obtain much larger classes for each
type order (E^K TIME versus EXP^(K-1) TIME), and (iii) results for cons-free
term rewriting systems have previously only been obtained for K = 1, and with
additional syntactic restrictions besides cons-freeness and left-linearity.
Our results are among the first implicit characterizations of the hierarchy E
= E^1 TIME ⊊ E^2 TIME ⊊ ... Our work confirms prior
results that having full non-determinism (via overlapping rules) does not
directly allow for characterization of non-deterministic complexity classes
like NE. We also show that non-determinism makes the classes characterized
highly sensitive to minor syntactic changes like admitting product types or
non-left-linear rules.
Comment: extended version of a paper submitted to FSCD 2016. arXiv admin note: substantial text overlap with arXiv:1604.0893
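To fix intuition about cons-freeness, here is the restriction transposed to
Haskell (an illustration of the idea, not an example from the paper; result
constants such as True and False are conventionally permitted):

```haskell
-- Cons-free: right-hand sides may decompose the input and reuse its
-- subterms, but never build new data structures.
-- List membership obeys the restriction ...
member :: Eq a => a -> [a] -> Bool
member _ []       = False
member x (y : ys) = x == y || member x ys

-- ... while list reversal violates it: the right-hand side builds a
-- list that is not a subterm of the input.
rev :: [a] -> [a]
rev []       = []
rev (x : xs) = rev xs ++ [x]   -- (++) and [x] construct new cells
```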
Higher-Order Termination: from Kruskal to Computability
Termination is a major question in both logic and computer science. In logic,
termination is at the heart of proof theory where it is usually called strong
normalization (of cut elimination). In computer science, termination has always
been an important issue for showing programs correct. In the early days of
logic, strong normalization was usually shown by assigning ordinals to
expressions in such a way that eliminating a cut would yield an expression with
a smaller ordinal. In the early days of verification, computer scientists used
similar ideas, interpreting the arguments of a program call by a natural
number, such as their size. Showing the size of the arguments to decrease for
each recursive call gives a termination proof of the program, which is however
rather weak since it can only yield quite small ordinals. In the sixties, Tait
invented a new method for showing cut elimination of natural deduction, based
on a predicate over the set of terms, such that the membership of an expression
to the predicate implied the strong normalization property for that expression.
The predicate being defined by induction on types, or even as a fixpoint, this
method could yield much larger ordinals. Later generalized by Girard under the
name of reducibility or computability candidates, it proved very effective in
showing the strong normalization property of typed lambda-calculi.
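For simple types, the predicate in question takes only two lines (the
standard Tait-style definition, recalled here for orientation):

```latex
% Tait-style computability (reducibility) predicate, defined by induction
% on simple types; SN denotes the set of strongly normalizing terms.
\mathrm{Comp}_{\iota} = \mathrm{SN}
\qquad
\mathrm{Comp}_{A \to B} = \{\, t \mid \forall u \in \mathrm{Comp}_{A}.\ t\,u \in \mathrm{Comp}_{B} \,\}
% One shows that every computable term is strongly normalizing and that
% every well-typed term is computable; strong normalization follows.
```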
Towards 3-Dimensional Rewriting Theory
String rewriting systems have proved very useful to study monoids. In good
cases, they give finite presentations of monoids, allowing computations on
those and their manipulation by a computer. Even better, when the presentation
is confluent and terminating, they provide one with a notion of canonical
representative of the elements of the presented monoid. Polygraphs are a
higher-dimensional generalization of this notion of presentation, from the
setting of monoids to the much more general setting of n-categories. One of the
main purposes of this article is to give a progressive introduction to the
notion of higher-dimensional rewriting system provided by polygraphs, and
describe its links with classical rewriting theory, string and term rewriting
systems in particular. After introducing the general setting, we will be
interested in proving local confluence for polygraphs presenting 2-categories
and introduce a framework in which a finite 3-dimensional rewriting system
admits a finite number of critical pairs.
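As a concrete instance of the canonical-representative remark above, the free
commutative monoid on two generators is presented by the single rule ba -> ab,
which is terminating and confluent; a small Haskell normaliser (an
illustrative example, not taken from the paper):

```haskell
-- Presentation of the free commutative monoid on {a, b} by the
-- terminating, confluent rule "ba" -> "ab"; the normal forms a^m b^n
-- are canonical representatives of the presented elements.
step :: String -> Maybe String
step ('b' : 'a' : rest) = Just ('a' : 'b' : rest)  -- apply ba -> ab
step (c : rest)         = (c :) <$> step rest      -- search to the right
step []                 = Nothing                  -- no redex: normal form

normalize :: String -> String
normalize w = maybe w normalize (step w)

-- normalize "babba" == "aabbb"
```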