Edit Distance for Pushdown Automata
The edit distance between two words w1 and w2 is the minimal number of word
operations (letter insertions, deletions, and substitutions) necessary to
transform w1 into w2. The edit distance generalizes to languages L1 and L2,
where the edit distance from L1 to L2 is the minimal number k such that for
every word from L1 there exists a word in L2 with edit distance at most k. We
study the edit distance computation problem between pushdown automata and
their subclasses. The problem of computing the edit distance to a pushdown
automaton is undecidable, and in practice the interesting question is to
compute the edit distance from a pushdown automaton (the implementation, a
standard model for programs with recursion) to a regular language (the
specification). In this work, we present a complete picture of decidability
and complexity for the following problems: (1) deciding whether, for a given
threshold k, the edit distance from a pushdown automaton to a finite automaton
is at most k, and (2) deciding whether the edit distance from a pushdown
automaton to a finite automaton is finite.
Comment: An extended version of a paper accepted to ICALP 2015 with the same
title. The paper has been accepted to the LMCS journal.
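As a concrete illustration of the word-level definition above (not of the paper's automata constructions), the following Python sketch computes the edit distance between two words by standard dynamic programming and lifts it to languages exactly as in the definition; the function names are ours, and the language-level helper is only meaningful for finite sets of words, whereas the paper treats languages given by automata.

```python
def edit_distance(w1: str, w2: str) -> int:
    """Minimal number of letter insertions, deletions, and substitutions
    needed to transform w1 into w2 (classic Levenshtein DP, one row at a time)."""
    m, n = len(w1), len(w2)
    dp = list(range(n + 1))          # dp[j] = distance between w1[:i] and w2[:j]
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i       # prev holds the old dp[j-1] (diagonal value)
        for j in range(1, n + 1):
            cur = dp[j]
            if w1[i - 1] == w2[j - 1]:
                dp[j] = prev
            else:
                dp[j] = 1 + min(prev,       # substitution
                                dp[j],      # deletion of w1[i-1]
                                dp[j - 1])  # insertion of w2[j-1]
            prev = cur
        # end of row i
    return dp[n]

def language_edit_distance(L1, L2):
    """Edit distance from a finite language L1 to a finite language L2:
    the least k such that every word of L1 is within distance k of some
    word of L2.  Only a brute-force illustration for finite sets."""
    return max(min(edit_distance(u, v) for v in L2) for u in L1)

print(edit_distance("kitten", "sitting"))              # 3
print(language_edit_distance({"ab", "abb"}, {"b"}))    # 2
```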
A random testing approach using pushdown automata
Since finite automata are in general strong abstractions of systems, many test cases, which are automaton traces generated uniformly at random, may be unconcretizable. This paper proposes a method extending this testing approach to pushdown systems, which provide finer abstractions. Using combinatorial techniques guarantees the uniformity of the generated traces. In addition, to improve the quality of the test suites, the combination of coverage criteria with random testing is investigated. The method is illustrated within both structural and model-based testing contexts.
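A minimal sketch of the counting-based uniform generation that underlies such approaches, here for an ordinary finite automaton and a fixed trace length: count, for each state and remaining length, the number of accepting runs, then draw each transition with probability proportional to the number of completions. The names and the example automaton below are illustrative; the paper's construction additionally handles pushdown stacks and coverage criteria.

```python
import random
from collections import defaultdict

def count_runs(transitions, accepting, length):
    """counts[l][q] = number of accepting runs of exactly l more steps from state q.
    transitions: dict mapping state -> list of (letter, next_state)."""
    counts = [defaultdict(int) for _ in range(length + 1)]
    for q in accepting:
        counts[0][q] = 1
    for l in range(1, length + 1):
        for q, outs in transitions.items():
            counts[l][q] = sum(counts[l - 1][q2] for _, q2 in outs)
    return counts

def uniform_trace(transitions, initial, accepting, length, rng=random):
    """Draw an accepting run of the given length uniformly at random
    (uniform over runs; for deterministic automata this is uniform over words).
    Returns None if no such run exists."""
    counts = count_runs(transitions, accepting, length)
    if counts[length][initial] == 0:
        return None
    trace, q = [], initial
    for remaining in range(length, 0, -1):
        outs = transitions[q]
        weights = [counts[remaining - 1][q2] for _, q2 in outs]
        letter, q = rng.choices(outs, weights=weights, k=1)[0]
        trace.append(letter)
    return trace

# Example: automaton over {a, b}, accepting in state 1 (after reading a b).
trans = {0: [("a", 0), ("b", 1)], 1: [("a", 0), ("b", 1)]}
print(uniform_trace(trans, initial=0, accepting={1}, length=5))
```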
Answering Regular Path Queries on Workflow Provenance
This paper proposes a novel approach for efficiently evaluating regular path
queries over provenance graphs of workflows that may include recursion. The
approach assumes that an execution g of a workflow G is labeled with
query-agnostic reachability labels using an existing technique. At query time,
given g, G and a regular path query R, the approach decomposes R into a set of
subqueries R1, ..., Rk that are safe for G. For each safe subquery Ri, G is
rewritten so that, using the reachability labels of nodes in g, whether or not
there is a path which matches Ri between two nodes can be decided in constant
time. The results of each safe subquery are then composed, possibly with some
small unsafe remainder, to produce an answer to R. The approach results in an
algorithm that significantly reduces the number of subqueries k over existing
techniques by increasing their size and complexity, and that evaluates each
subquery in time bounded by its input and output size. Experimental results
demonstrate the benefit of this approach.
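The paper relies on an existing, query-agnostic labeling technique for the reachability labels. As a minimal illustration of how such labels yield constant-time reachability tests, the sketch below uses pre/post-order interval labels, which are only correct when the provenance graph is tree-shaped; all names here are ours, and the general DAG case requires the richer labeling the paper assumes.

```python
def interval_labels(children, root):
    """Assign (pre, post) numbers by DFS over a tree-shaped graph so that
    u reaches v  iff  pre[u] <= pre[v] and post[v] <= post[u].
    children: dict node -> list of child nodes."""
    pre, post, counter = {}, {}, 0
    stack = [(root, False)]
    while stack:
        node, done = stack.pop()
        if done:
            post[node] = counter
        else:
            pre[node] = counter
            stack.append((node, True))
            for c in children.get(node, []):
                stack.append((c, False))
        counter += 1
    return pre, post

def reaches(u, v, pre, post):
    """Constant-time reachability test using the interval labels."""
    return pre[u] <= pre[v] and post[v] <= post[u]

# Tiny workflow-like example (node names are made up).
children = {"start": ["align", "filter"], "align": ["merge"], "filter": []}
pre, post = interval_labels(children, "start")
print(reaches("start", "merge", pre, post))   # True
print(reaches("filter", "merge", pre, post))  # False
```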
Liveness-Based Garbage Collection for Lazy Languages
We consider the problem of reducing the memory required to run lazy
first-order functional programs. Our approach is to analyze programs for
liveness of heap-allocated data. The result of the analysis is used to preserve
only live data---a subset of reachable data---during garbage collection. The
result is an increase in the garbage reclaimed and a reduction in the peak
memory requirement of programs. While this technique has already been shown to
yield benefits for eager first-order languages, the lack of a statically
determinable execution order and the presence of closures pose new challenges
for lazy languages. These require changes both in the liveness analysis itself
and in the design of the garbage collector.
To show the effectiveness of our method, we implemented a copying collector
that uses the results of the liveness analysis to preserve live objects, both
evaluated objects (i.e., those in WHNF) and closures. Our experiments confirm
that for programs running with a liveness-based garbage collector, there is a
significant decrease in peak memory requirements. In addition, a sizable
reduction in the number of collections ensures that, in spite of using a more
complex garbage collector, the execution times of programs running with
liveness-based and reachability-based collectors remain comparable.
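A rough sketch of the underlying idea, with hypothetical names and not the paper's implementation: a copying pass that consults a per-field liveness predicate supplied by the analysis and cuts off dead fields, so that data reachable only through them is never copied.

```python
# Minimal sketch of liveness-guided copying collection (hypothetical names).
# Heap cells are cons-like nodes whose fields hold immediate values or
# references to other cells.

class Cell:
    def __init__(self, tag, fields):
        self.tag = tag          # e.g. "cons", "closure"
        self.fields = fields    # list of values or Cell references
        self.forward = None     # forwarding pointer, set once copied

DEAD = object()  # placeholder written into fields the analysis proved dead

def copy_live(roots, live_fields):
    """Copy every cell reachable through *live* fields into a fresh to-space.

    live_fields(cell, i) -> bool is the interface to the liveness analysis:
    whether field i of `cell` may still be used by the program.  A plain
    reachability-based collector corresponds to it always returning True."""
    to_space = []

    def copy(cell):
        if not isinstance(cell, Cell):
            return cell                     # immediate value
        if cell.forward is not None:
            return cell.forward             # already copied
        new = Cell(cell.tag, list(cell.fields))
        cell.forward = new
        to_space.append(new)
        for i, f in enumerate(cell.fields):
            # Dead fields are cut off instead of copied.
            new.fields[i] = copy(f) if live_fields(cell, i) else DEAD
        return new

    return [copy(r) for r in roots], to_space

# Example: the tail of the outer cons is reachable but, per the
# (hypothetical) analysis, never used, so it is not copied.
tail = Cell("cons", [2, None])
lst = Cell("cons", [1, tail])
roots, heap = copy_live([lst], lambda c, i: not (c is lst and i == 1))
print(len(heap))  # 1: only the outer cell survives collection
```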