200 research outputs found
Recontamination Helps a Lot to Hunt a Rabbit
The Hunters and Rabbit game is played on a graph G where the Hunter player shoots at k vertices in every round while the Rabbit player occupies an unknown vertex and, if it is not shot, must move to a neighbouring vertex after each round. The Rabbit player wins if it can ensure that its position is never shot. The Hunter player wins otherwise. The hunter number h(G) of a graph G is the minimum integer k such that the Hunter player has a winning strategy (i.e., a strategy that wins against every strategy of the Rabbit player). This game has been studied in several graph classes, in particular in bipartite graphs (grids, trees, hypercubes...), but the computational complexity of computing h(G) remains open in general graphs and even in more restricted graph classes such as trees. To progress further in this study, we propose a notion of monotonicity (a well-studied and useful property in classical pursuit-evasion games such as Graph Searching games) for the Hunters and Rabbit game imposing that, roughly, a vertex that has already been shot "must not host the rabbit anymore". This allows us to obtain new results in various graph classes.
More precisely, let the monotone hunter number mh(G) of a graph G be the minimum integer k such that the Hunter player has a monotone winning strategy. We show that pw(G) <= mh(G) <= pw(G)+1 for any graph G with pathwidth pw(G), which implies that computing mh(G), or even approximating mh(G) up to an additive constant, is NP-hard. Then, we show that mh(G) can be computed in polynomial time in split graphs, interval graphs, cographs and trees. These results go through structural characterisations which allow us to relate the monotone hunter number with the pathwidth in some of these graph classes. In all cases, this allows us to specify the hunter number or to show that there may be an arbitrary gap between h and mh, i.e., that monotonicity does not help. In particular, we show that, for every k >= 3, there exists a tree T with h(T) = 2 and mh(T) = k. We conclude by proving that computing h (resp., mh) is FPT parameterised by the minimum size of a vertex cover.
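The game dynamics described above can be made concrete with a small brute-force sketch. This is not the paper's algorithm, only an illustration of the definition: a game state is the set S of vertices the rabbit might occupy, the Hunter wins from S if some k-vertex shot leads to a winning state, and the winning region is a least fixpoint over all position sets (feasible only for very small graphs).

```python
from itertools import combinations

def hunter_wins(adj, k):
    """True if the Hunter, shooting k vertices per round, has a winning
    strategy on the graph given by adjacency lists `adj`."""
    n = len(adj)
    all_sets = [frozenset(c) for r in range(n + 1)
                for c in combinations(range(n), r)]
    winning = {frozenset()}  # rabbit caught: no possible position left
    changed = True
    while changed:           # least-fixpoint computation of the winning region
        changed = False
        for s in all_sets:
            if s in winning:
                continue
            # shots outside S are wasted, so only subsets of S are tried
            for shot in combinations(sorted(s), min(k, len(s))):
                survivors = s - set(shot)
                # the rabbit, if not shot, must move to a neighbouring vertex
                moved = frozenset(v for u in survivors for v in adj[u])
                if moved in winning:
                    winning.add(s)
                    changed = True
                    break
    return frozenset(range(n)) in winning

def hunter_number(adj):
    """Smallest k for which the Hunter wins, i.e., the hunter number h(G)."""
    k = 1
    while not hunter_wins(adj, k):
        k += 1
    return k
```

On the path P3 a single hunter suffices (shoot the middle vertex twice), while the triangle already requires two shots per round, so `hunter_number` returns 1 and 2 respectively.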
Efficient parameterized algorithms on structured graphs
In classical complexity theory, the worst-case running times of algorithms depend solely on the size of the input. In parameterized complexity the goal is to refine the analysis of the running time of an algorithm by additionally considering a parameter that measures some kind of structure in the input. A parameterized algorithm then utilizes the structure described by the parameter and achieves a running time that is faster than the best general (unparameterized) algorithm for instances of low parameter value.
In the first part of this thesis, we continue this line of research and investigate the influence of several parameters on the running times of well-known tractable problems.
Several presented algorithms are adaptive algorithms, meaning that they match the running time of a best unparameterized algorithm for worst-case parameter values. Thus, an adaptive parameterized algorithm is asymptotically never worse than the best unparameterized algorithm, while it outperforms the best general algorithm already for slightly non-trivial parameter values.
As illustrated in the first part of this thesis, for many problems there exist efficient parameterized algorithms regarding multiple parameters, each describing a different kind of structure.
In the second part of this thesis, we explore how to combine such homogeneous structures to more general and heterogeneous structures.
Using algebraic expressions, we define new combined graph classes
of heterogeneous structure in a clean and robust way, and we showcase this for the heterogeneous merge of the parameters tree-depth and modular-width, by presenting parameterized algorithms
on such heterogeneous graph classes, obtaining running times that match the homogeneous cases throughout.
Computing Optimal Leaf Roots of Chordal Cographs in Linear Time
A graph G is a k-leaf power, for an integer k >= 2, if there is a tree T with
leaf set V(G) such that, for all vertices x, y in V(G), the edge xy exists in G
if and only if the distance between x and y in T is at most k. Such a tree T is
called a k-leaf root of G. The computational problem of constructing a k-leaf
root for a given graph G and an integer k, if any, is motivated by the
challenge from computational biology to reconstruct phylogenetic trees. For
fixed k, Lafond [SODA 2022] recently solved this problem in polynomial time.
In this paper, we propose to study optimal leaf roots of graphs G, that is,
the k-leaf roots of G with minimum k value. Thus, all k'-leaf roots of G
satisfy k <= k'. In terms of computational biology, seeking optimal leaf roots
is more justified as they yield more probable phylogenetic trees. Lafond's
result does not imply polynomial-time computability of optimal leaf roots,
because, even for optimal k-leaf roots, k may (exponentially) depend on the
size of G. This paper presents a linear-time construction of optimal leaf roots
for chordal cographs (also known as trivially perfect graphs). Additionally, it
highlights the importance of the parity of the parameter k and provides a
deeper insight into the differences between optimal k-leaf roots of even versus
odd k.
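The definition of a k-leaf root can be verified directly. The following sketch is illustrative only (not the paper's construction): the tree is given as an adjacency dict, its leaves must be exactly the graph's vertices, and each pair is checked against the distance condition.

```python
from collections import deque

def is_k_leaf_root(tree, graph_vertices, graph_edges, k):
    """Check whether `tree` (adjacency dict) is a k-leaf root of the graph
    (graph_vertices, graph_edges): the leaf set of the tree must equal the
    vertex set, and xy must be an edge iff dist_T(x, y) <= k."""
    leaves = {v for v, nbrs in tree.items() if len(nbrs) == 1}
    if leaves != set(graph_vertices):
        return False

    def dist_from(src):
        # breadth-first search gives all tree distances from src
        d = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in tree[u]:
                if w not in d:
                    d[w] = d[u] + 1
                    q.append(w)
        return d

    edges = {frozenset(e) for e in graph_edges}
    vs = sorted(graph_vertices)
    for i, x in enumerate(vs):
        d = dist_from(x)
        for y in vs[i + 1:]:
            if (frozenset((x, y)) in edges) != (d[y] <= k):
                return False
    return True
```

For example, a star with three leaves is a 2-leaf root of the triangle on those leaves (all leaf distances are 2), but not of a graph that omits one of the edges.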
Keywords: k-leaf power, k-leaf root, optimal k-leaf root, trivially perfect leaf power, chordal cograph
Comment: 22 pages, 2 figures, full version of the FCT 2023 paper.
Further results on the Hunters and Rabbit game through monotonicity
The Hunters and Rabbit game is played on a graph G where the Hunter player shoots at k vertices in every round while the Rabbit player occupies an unknown vertex and, if it is not shot, must move to a neighbouring vertex after each round. The Rabbit player wins if it can ensure that its position is never shot. The Hunter player wins otherwise. The hunter number h(G) of a graph G is the minimum integer k such that the Hunter player has a winning strategy (i.e., a strategy that wins against every strategy of the Rabbit player). This
game has been studied in several graph classes, in particular in bipartite graphs (grids, trees, hypercubes...), but the computational complexity of computing h(G) remains open in general graphs and even in trees. To progress
further, we propose a notion of monotonicity for the Hunters and Rabbit game
imposing that, roughly, a vertex that has already been shot ``must not host the
rabbit anymore''. This allows us to obtain new results in various graph
classes.
Let the monotone hunter number of G, denoted by mh(G), be the minimum integer k such that the Hunter player has a monotone winning strategy. We show that pw(G) <= mh(G) <= pw(G)+1 for any graph G with pathwidth pw(G), implying that computing mh(G), or even approximating mh(G) up to an additive constant, is NP-hard. Then, we show that mh(G) can be computed in polynomial time in split graphs, interval graphs, cographs and trees. These results go through structural characterisations which allow us to relate the monotone hunter number with the pathwidth in some of these graph classes. In all cases, this allows us to specify the hunter number or to show that there may be an arbitrary gap between h and mh, i.e., that monotonicity does not help. In particular, we show that, for every k >= 3, there exists a tree T with h(T) = 2 and mh(T) = k. We conclude by proving that computing h (resp., mh) is FPT parameterised by the minimum size of a vertex cover.
Comment: A preliminary version appeared in MFCS 2023. Abstract shortened due to Arxiv submission requirements.
A survey of parameterized algorithms and the complexity of edge modification
The survey is a comprehensive overview of the developing area of parameterized algorithms for graph modification problems. It describes the state of the art in kernelization, subexponential algorithms, and parameterized complexity of graph modification. The main focus is on edge modification problems, where the task is to change some adjacencies in a graph to satisfy some required properties. To facilitate further research, we list many open problems in the area.
Resolving Prime Modules: The Structure of Pseudo-cographs and Galled-Tree Explainable Graphs
The modular decomposition of a graph G is a natural construction to capture key features of G in terms of a labeled tree (T,t) whose vertices are labeled as "series" (label 1), "parallel" (label 0) or "prime". However, full information of G is provided by its modular decomposition tree (T,t) only if G is a cograph, i.e., if G does not contain prime modules. In this case, (T,t) explains G, i.e., xy is an edge of G if and only if the lowest common ancestor of x and y has label "1". Pseudo-cographs, or, more generally, GaTEx graphs are graphs that can be explained by labeled galled-trees, i.e., labeled networks that are obtained from the modular decomposition tree of G by replacing the prime vertices by simple labeled cycles. GaTEx graphs can be recognized, and labeled galled-trees that explain these graphs can be constructed, in linear time.
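The "explains" relation for the cograph case can be illustrated in a few lines of code (an illustrative sketch, not from the paper; the cotree is given as a parent map plus a 0/1 label on its inner vertices):

```python
from itertools import combinations

def graph_from_cotree(parent, label, leaves):
    """Rebuild the graph explained by a labeled cotree (T,t):
    xy is an edge iff the lowest common ancestor of x and y is
    labeled 1 ("series"); label 0 ("parallel") means no edge."""
    def ancestors(v):
        chain = []
        while v is not None:
            chain.append(v)
            v = parent[v]
        return chain

    edges = set()
    for x, y in combinations(leaves, 2):
        on_y_path = set(ancestors(y))
        # first ancestor of x that also lies above y is the lca
        lca = next(u for u in ancestors(x) if u in on_y_path)
        if label[lca] == 1:
            edges.add(frozenset((x, y)))
    return edges
```

As a check, the path a-b-c is the join of b with the parallel pair {a, c}; its cotree (series root over leaf b and a parallel node over a, c) is explained back into exactly the edges ab and bc.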
In this contribution, we provide a novel characterization of GaTEx graphs in terms of a set of 25 forbidden induced subgraphs. This characterization, in turn, allows us to show that GaTEx graphs are closely related to many other well-known graph classes such as P4-sparse and P4-reducible graphs, weakly-chordal graphs, perfect graphs with perfect order, comparability and permutation graphs, murky graphs as well as interval graphs, Meyniel graphs or very strongly-perfect and brittle graphs. Moreover, we show that every GaTEx graph has twin-width at most 1.
Comment: 18 pages, 3 figures.
Streaming deletion problems parameterized by vertex cover
Streaming is a model where an input graph is provided one edge at a time, instead of being able to inspect it at will. In this work, we take a parameterized approach by assuming a vertex cover of the graph is given, building on work of Bishnu et al. [COCOON 2020]. We show the further potency of combining this parameter with the Adjacency List streaming model to obtain results for vertex deletion problems. This includes kernels, parameterized algorithms, and lower bounds for the problems of Π-free Deletion, H-free Deletion, and the more specific forms of Cluster Vertex Deletion and Odd Cycle Transversal. We focus on the complexity in terms of the number of passes over the input stream and the memory used. This leads to a pass/memory trade-off, where a different algorithm might be favourable depending on the context and instance. We also discuss implications for parameterized complexity in the non-streaming setting.
Computing Well-Covered Vector Spaces of Graphs using Modular Decomposition
A graph is well-covered if all its maximal independent sets have the same cardinality. This well-studied concept was introduced by Plummer in 1970 and naturally generalizes to the weighted case. Given a graph G, a real-valued vertex weight function w is said to be a well-covered weighting of G if all its maximal independent sets are of the same weight. The set of all well-covered weightings of a graph G forms a vector space over the field of real numbers, called the well-covered vector space of G. Since the problem of recognizing well-covered graphs is co-NP-complete, the problem of computing the well-covered vector space of a given graph is co-NP-hard. Levit and Tankus showed in 2015 that the problem admits a polynomial-time algorithm in the class of claw-free graphs. In this paper, we give two general reductions for the problem, one based on anti-neighborhoods and one based on modular decomposition, combined with Gaussian elimination. Building on these results, we develop a polynomial-time algorithm for computing the well-covered vector space of a given fork-free graph, generalizing the result of Levit and Tankus. Our approach implies that well-covered fork-free graphs can be recognized in polynomial time and also generalizes some known results on cographs.
Comment: 25 pages.
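The definition lends itself to a brute-force sketch (illustrative only; the point of the paper is precisely to avoid this exponential enumeration): list all maximal independent sets, impose that they have equal weight, and read off the dimension of the solution space by exact Gaussian elimination over the rationals.

```python
from itertools import combinations
from fractions import Fraction

def maximal_independent_sets(adj):
    """All maximal independent sets of the graph adj (list of neighbour sets)."""
    n = len(adj)
    result = []
    for r in range(n + 1):
        for c in combinations(range(n), r):
            s = set(c)
            if any(v in adj[u] for u, v in combinations(c, 2)):
                continue                       # not independent
            if all(any(u in s for u in adj[v]) for v in range(n) if v not in s):
                result.append(s)               # maximal: no vertex can be added
    return result

def wcw_dimension(adj):
    """Dimension of the well-covered vector space: the space of weight
    functions giving every maximal independent set the same total weight."""
    n = len(adj)
    mis = maximal_independent_sets(adj)
    base = mis[0]
    # one linear constraint w(base) - w(s) = 0 per further maximal set s
    rows = [[Fraction(int(v in base) - int(v in s)) for v in range(n)]
            for s in mis[1:]]
    rank, col = 0, 0
    while col < n and rank < len(rows):        # Gaussian elimination, exact
        piv = next((r for r in range(rank, len(rows)) if rows[r][col]), None)
        if piv is None:
            col += 1
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r][col]:
                f = rows[r][col] / rows[rank][col]
                rows[r] = [a - f * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
        col += 1
    return n - rank                            # dimension of the null space
```

For the triangle K3 the constraints force w0 = w1 = w2 (dimension 1), while for the path P3 the single constraint w1 = w0 + w2 leaves dimension 2.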