Inapproximability of the Standard Pebble Game and Hard to Pebble Graphs
Pebble games are single-player games on DAGs involving placing and moving
pebbles on nodes of the graph according to a certain set of rules. The goal is
to pebble a set of target nodes using a minimum number of pebbles. In this
paper, we present a possibly simpler proof of the result in [CLNV15] and
strengthen it to show that it is PSPACE-hard to determine the minimum
number of pebbles even up to an additive term, which improves upon the
currently known additive constant hardness of approximation [CLNV15] for the
standard pebble game. We also introduce a family of explicit, constant-indegree
graphs with n nodes such that pebbling some graph in the family with a constant
number of pebbles requires a number of moves matching the known upper bound, in
both the standard and black-white pebble games. This independently answers an
open question summarized in [Nor15] of whether a family of DAGs exists that
meets the upper bound on moves using a constant number of pebbles, with a
construction different from the one presented in [AdRNV17].

Comment: Preliminary version in WADS 201
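The rules of the standard (black) pebble game described above can be made concrete with a brute-force search over pebble configurations: a pebble may be placed on a node only when all its predecessors carry pebbles (sources can be pebbled freely), and any pebble may be removed. The sketch below is ours, not the paper's; the configuration search is exponential, so it is only usable on toy DAGs.

```python
from collections import deque

def can_pebble(preds, target, k):
    """BFS over pebble configurations: can `target` be pebbled using at
    most k pebbles in the standard black pebble game?  `preds` maps each
    node to the list of its predecessors (empty for sources)."""
    start = frozenset()
    seen = {start}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if target in state:
            return True
        moves = []
        # Place a pebble on v if all predecessors of v are pebbled.
        for v in preds:
            if v not in state and len(state) < k \
                    and all(p in state for p in preds[v]):
                moves.append(state | {v})
        # Remove any pebble.
        for v in state:
            moves.append(state - {v})
        for nxt in moves:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

def min_pebbles(preds, target):
    """Smallest k for which target can be pebbled (toy DAGs only)."""
    k = 1
    while not can_pebble(preds, target, k):
        k += 1
    return k
```

For instance, a path a -> b -> c needs 2 pebbles, while a node with two source predecessors (a height-1 pyramid) needs 3.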
Phase Transition in Matched Formulas and a Heuristic for Biclique Satisfiability
A matched formula is a CNF formula whose incidence graph admits a matching
which matches a distinct variable to every clause. We study the phase
transition in the context of matched formulas and their generalization,
biclique satisfiable formulas. We have performed experiments to locate the
phase transition of the property "being matched" with respect to the ratio
m/n, where m is the number of clauses and n is the number of variables of the
input formula. We compare the results of these experiments to a theoretical
lower bound shown by Franco and Gelder (2003). Any matched formula is
satisfiable; moreover, it remains satisfiable even if we change the polarities
of any literal occurrences. Szeider (2005) generalized matched formulas into
two classes having the same property -- var-satisfiable and biclique
satisfiable formulas. A formula is biclique satisfiable if its incidence graph
admits a covering by pairwise disjoint bounded bicliques. Recognizing whether
a formula is biclique satisfiable is NP-complete. In this paper we describe a
heuristic algorithm for recognizing whether a formula is biclique satisfiable,
and we evaluate it by experiments on random formulas. We also describe an
encoding of the problem of checking whether a formula is biclique satisfiable
into SAT, which we use to evaluate the performance of our heuristic.

Comment: Conference version submitted to SOFSEM 2018
(https://beda.dcs.fmph.uniba.sk/sofsem2019/), 18 pages (17 without
references), 3 figures, 8 tables, an algorithm pseudocode
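Whether a formula is matched is itself easy to decide: it reduces to finding a matching that saturates every clause in the clause-variable incidence graph. A minimal sketch, assuming DIMACS-style clauses (lists of nonzero integer literals) and using Kuhn's augmenting-path algorithm; the function name is ours:

```python
def is_matched(clauses):
    """A CNF formula is matched iff its clause-variable incidence graph
    (clause i adjacent to the variables occurring in it) has a matching
    saturating every clause.  Kuhn's augmenting-path algorithm."""
    match = {}  # variable -> index of the clause currently matched to it

    def try_clause(i, visited):
        # Try to match clause i, rerouting earlier matches if needed.
        for lit in clauses[i]:
            v = abs(lit)
            if v in visited:
                continue
            visited.add(v)
            if v not in match or try_clause(match[v], visited):
                match[v] = i
                return True
        return False

    return all(try_clause(i, set()) for i in range(len(clauses)))
```

By the property quoted above, a True answer certifies satisfiability regardless of literal polarities: setting each matched variable to satisfy its clause works.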
Fully Dynamic Matching in Bipartite Graphs
Maximum cardinality matching in bipartite graphs is an important and
well-studied problem. The fully dynamic version, in which edges are inserted
and deleted over time, has also been the subject of much attention. Existing
algorithms for dynamic matching (in general graphs) seem to fall into two
groups: there are fast (mostly randomized) algorithms that do not achieve a
better-than-2 approximation, and there are slow algorithms with O(sqrt(m))
update time that achieve a better-than-2 approximation. Thus the obvious
question is whether we can design an algorithm -- deterministic or randomized
-- that achieves a tradeoff between these two: an update time faster than
O(sqrt(m)) and a better-than-2 approximation simultaneously. We answer this
question in the affirmative for bipartite graphs.
Our main result is a fully dynamic algorithm that maintains a (3/2 + eps)-
approximation in worst-case update time O(m^{1/4} eps^{-2.5}). We also give
stronger results for graphs whose arboricity is at most alpha, achieving a
(1 + eps)-approximation in worst-case time O(alpha(alpha + log n)) for
constant eps. When the arboricity is constant this bound is O(log n), and when
the arboricity is polylogarithmic the update time is also polylogarithmic.
The most important technical development is the use of an intermediate graph
we call an edge degree constrained subgraph (EDCS). This graph places
constraints on the sum of the degrees of the endpoints of each edge: upper
bounds for matched edges and lower bounds for unmatched edges. The main
technical content of our paper involves showing both how to maintain an EDCS
dynamically and that an EDCS always contains a sufficiently large matching. We
also make use of graph orientations to help bound the amount of work done
during each update.

Comment: Longer version of paper that appears in ICALP 201
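The EDCS degree constraints are easy to state as a static check. The sketch below verifies the two constraints described above for a candidate subgraph H of G; the function and the two bound parameters (an upper bound for edges inside H, a lower bound for edges of G missing from H) follow the abstract's description, but the names and interface are our own illustration, not the paper's API.

```python
def is_edcs(graph_edges, sub_edges, upper, lower):
    """Check the EDCS constraints for subgraph H of G:
      - every edge (u, v) IN H satisfies deg_H(u) + deg_H(v) <= upper
      - every edge of G NOT in H satisfies deg_H(u) + deg_H(v) >= lower
    Edges are sets of 2-element frozensets; degrees are taken in H."""
    deg = {}
    for e in sub_edges:
        for u in e:
            deg[u] = deg.get(u, 0) + 1
    d = lambda u: deg.get(u, 0)
    for e in sub_edges:                 # upper bound on edges of H
        u, v = tuple(e)
        if d(u) + d(v) > upper:
            return False
    for e in graph_edges - sub_edges:   # lower bound on missing edges
        u, v = tuple(e)
        if d(u) + d(v) < lower:
            return False
    return True
```

Intuitively, the upper bound keeps H sparse enough to maintain cheaply, while the lower bound forces every edge left out of H to have well-covered endpoints, which is what makes a maximum matching of H a good approximation for G.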
Testing the Equivalence of Regular Languages
The minimal deterministic finite automaton is generally used to decide the
equality of regular languages. Antimirov and Mosses proposed a rewrite system
for deciding regular expression equivalence, of which Almeida et al. presented
an improved variant. Hopcroft and Karp proposed an almost linear algorithm for
testing the equivalence of two deterministic finite automata that avoids
minimisation. In this paper we improve the best-case running time, present an
extension of this algorithm to non-deterministic finite automata, and
establish a relationship between this algorithm and the one proposed by
Almeida et al. We also present some experimental comparative results. All
these algorithms are closely related to the recent coalgebraic approach to
automata proposed by Rutten.
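The Hopcroft-Karp idea mentioned above can be sketched in a few lines: tentatively merge the two start states with a union-find structure and propagate merges along each alphabet symbol; the DFAs are inequivalent iff a final state ever merges with a non-final one. This is a minimal sketch assuming complete DFAs with disjoint state names, not the paper's refined version.

```python
def dfa_equivalent(delta1, start1, final1, delta2, start2, final2, alphabet):
    """Hopcroft-Karp style equivalence test without minimisation.
    Each delta maps (state, symbol) -> state; the two DFAs must be
    complete and use disjoint state names."""
    parent = {}

    def find(x):                       # union-find with path halving
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    final = final1 | final2
    delta = {**delta1, **delta2}
    stack = [(start1, start2)]
    while stack:
        p, q = stack.pop()
        rp, rq = find(p), find(q)
        if rp == rq:
            continue                   # already known equivalent
        if (p in final) != (q in final):
            return False               # same word, different acceptance
        parent[rp] = rq                # merge the two classes
        for a in alphabet:
            stack.append((delta[(p, a)], delta[(q, a)]))
    return True
```

Each popped pair of states is reachable by the same input word, so a finality mismatch is a concrete counterexample word.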
Maximal Sharing in the Lambda Calculus with letrec
Increasing sharing in programs is desirable to compactify the code, and to
avoid duplication of reduction work at run-time, thereby speeding up execution.
We show how a maximal degree of sharing can be obtained for programs expressed
as terms in the lambda calculus with letrec. We introduce a notion of `maximal
compactness' for lambda-letrec-terms among all terms with the same infinite
unfolding. Instead of being defined purely syntactically, this notion is based on a
graph semantics. lambda-letrec-terms are interpreted as first-order term graphs
so that unfolding equivalence between terms is preserved and reflected through
bisimilarity of the term graph interpretations. Compactness of the term graphs
can then be compared via functional bisimulation.
We describe practical and efficient methods for the following two problems:
transforming a lambda-letrec-term into a maximally compact form; and deciding
whether two lambda-letrec-terms are unfolding-equivalent. The transformation
of a lambda-letrec-term L into maximally compact form L0 proceeds in three
steps:
(i) translate L into its term graph G; (ii) compute the maximally shared form
of G as its bisimulation collapse G0; (iii) read back a lambda-letrec-term L0
from the term graph G0, so that the term graph of L0 is again G0. This
guarantees that L and L0 have the same unfolding, and that L0 exhibits maximal
sharing.
The procedure for deciding whether two given lambda-letrec-terms L1 and L2 are
unfolding-equivalent computes their term graph interpretations G1 and G2, and
checks whether these term graphs are bisimilar.
For illustration, we also provide a readily usable implementation.

Comment: 18 pages, plus 19 pages appendix
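The bisimulation-collapse step on first-order term graphs can be sketched with partition refinement: start by splitting nodes by label, then repeatedly refine by the classes of their ordered children until stable. Nodes ending in the same class are bisimilar, so they have the same infinite unfolding and can be shared. This is our sketch of the generic collapse only; the paper's actual contribution, the scope-respecting translation of lambda-letrec terms into such graphs, is not reproduced here.

```python
def bisim_classes(nodes):
    """Partition refinement over a first-order term graph given as
    {node: (label, (child, ...))}.  Returns {node: class_id}; two nodes
    get the same class id iff they are bisimilar.  The bisimulation
    collapse keeps one representative node per class."""
    cls = {n: lab for n, (lab, _) in nodes.items()}  # split by label
    while True:
        # Signature: own class plus the tuple of the children's classes.
        sig = {n: (cls[n], tuple(cls[c] for c in ch))
               for n, (_, ch) in nodes.items()}
        canon, new = {}, {}
        for n in nodes:
            new[n] = canon.setdefault(sig[n], len(canon))
        if len(set(new.values())) == len(set(cls.values())):
            return new                # no further refinement: stable
        cls = new
```

For example, the cyclic graphs {0: ('f', (0,))} and {1: ('f', (2,)), 2: ('f', (1,))} both unfold to the infinite term f(f(f(...))), and refinement puts all three nodes into a single class.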
Checking NFA equivalence with bisimulations up to congruence
We introduce bisimulation up to congruence as a technique for proving language equivalence of non-deterministic finite automata. Exploiting this technique, we devise an optimisation of the classical algorithm by Hopcroft and Karp. We compare our algorithm to the recently introduced antichain algorithms, by analysing and relating the two underlying coinductive proof methods. We give concrete examples where we exponentially improve over antichains; experimental results moreover show non-negligible improvements on random automata.

Comment: 16 pages
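For contrast, the baseline this paper optimises can be sketched directly: determinize both NFAs on the fly and run Hopcroft-Karp on pairs of state-sets. The sketch below omits the paper's up-to-congruence pruning (which discards pairs derivable from already-visited ones via union and is what yields the exponential improvements); names and interface are ours.

```python
def nfa_lang_equiv(delta1, start1, final1, delta2, start2, final2, alphabet):
    """Naive on-the-fly subset construction: explore pairs of
    determinized state-sets and compare acceptance.  Each delta maps
    (state, symbol) -> set of successor states."""
    def step(states, a, delta):
        out = set()
        for s in states:
            out |= delta.get((s, a), set())
        return frozenset(out)

    seen = set()
    stack = [(frozenset(start1), frozenset(start2))]
    while stack:
        X, Y = stack.pop()
        if (X, Y) in seen:
            continue
        seen.add((X, Y))
        # A set of NFA states accepts iff it contains a final state.
        if bool(X & final1) != bool(Y & final2):
            return False
        for a in alphabet:
            stack.append((step(X, a, delta1), step(Y, a, delta2)))
    return True
```

In the worst case this visits exponentially many state-set pairs, which is exactly the blow-up the up-to-congruence technique avoids.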
The finite tiling problem is undecidable in the hyperbolic plane
In this paper, we consider the finite tiling problem, which was proved
undecidable in the Euclidean plane by Jarkko Kari in 1994. Here, we prove that
the same problem is also undecidable in the hyperbolic plane.
Applications of Automata and Graphs: Labeling-Operators in Hilbert Space I
We show that certain representations of graphs by operators on Hilbert space
have uses in signal processing and in symbolic dynamics. Our main result is
that graphs built on automata have fractal characteristics. We make this
precise with the use of Representation Theory and of Spectral Theory of a
certain family of Hecke operators. Let G be a directed graph. We begin by
building the graph groupoid induced by G, and representations of this
groupoid. Our main application is to the groupoids defined from automata. By
assigning weights to
the edges of a fixed graph G, we give conditions for G to acquire fractal-like
properties, and hence we can have fractaloids or G-fractals. Our standing
assumption on G is that it is locally finite and connected, and our labeling of
G is determined by the "out-degrees of vertices". From our labeling, we arrive
at a family of Hecke-type operators whose spectrum is computed. As
applications, we are able to build representations by operators on Hilbert
spaces (including the Hecke operators); and we further show that automata built
on a finite alphabet generate fractaloids. Our Hecke-type operators, or
labeling operators, come from an amalgamated free probability construction, and
we compute the corresponding amalgamated free moments. We show that the free
moments are completely determined by certain scalar-valued functions.

Comment: 69 pages
Inverse monoids and immersions of 2-complexes
It is well known that under mild conditions on a connected topological space
X, connected covers of X may be classified via conjugacy classes of subgroups
of the fundamental group of X. In this paper, we extend these results to the
study of immersions into 2-dimensional CW-complexes. An immersion f between
CW-complexes is a cellular map such that each point has a neighborhood that is
mapped homeomorphically onto its image by f. In order to classify immersions
into a 2-dimensional CW-complex C, we need to replace the fundamental group of
C by an appropriate inverse monoid. We show how conjugacy classes of the
closed inverse submonoids of this inverse monoid may be used to classify
connected immersions into the complex.