62 research outputs found
Randomized contractions meet lean decompositions
We show an algorithm that, given an $n$-vertex graph $G$ and a parameter $k$,
in time $2^{O(k \log k)} n^{O(1)}$ finds a tree decomposition of $G$ with the
following properties:
* every adhesion of the tree decomposition is of size at most $k$, and
* every bag of the tree decomposition is $(i,i)$-unbreakable in $G$ for every
$1 \leq i \leq k$.
Here, a set $X \subseteq V(G)$ is $(a,b)$-unbreakable in $G$ if for every
separation $(A,B)$ of order at most $b$ in $G$, we have $|A \cap X| \leq a$ or
$|B \cap X| \leq a$. The resulting tree decomposition has arguably best
possible adhesion size bounds and unbreakability guarantees. Furthermore, the
parametric factor in the running time bound is significantly smaller than in
previous similar constructions. These improvements allow us to present
parameterized algorithms for Minimum Bisection, Steiner Cut, and Steiner
Multicut with improved parametric factor in the running time bound.
The main technical insight is to adapt the notion of lean decompositions of
Thomas and the subsequent construction algorithm of Bellenbaum and Diestel to
the parameterized setting.
Comment: v2: New co-author (Magnus) and improved results on vertex
unbreakability of bags, v3: final changes, including new abstract
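The $(a,b)$-unbreakability condition is concrete enough to verify by exhaustive search on small graphs: enumerate every candidate separator $C$ of size at most $b$ and every bipartition of the components of $G - C$. A minimal sketch (function names and the brute-force strategy are my own illustration, not the paper's construction):

```python
from itertools import combinations


def components(vertices, edges, removed):
    """Connected components of the graph after deleting `removed`."""
    remaining = set(vertices) - set(removed)
    adj = {v: set() for v in remaining}
    for u, v in edges:
        if u in remaining and v in remaining:
            adj[u].add(v)
            adj[v].add(u)
    comps, seen = [], set()
    for s in remaining:
        if s in seen:
            continue
        stack, comp = [s], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        comps.append(comp)
    return comps


def is_unbreakable(vertices, edges, X, a, b):
    """Check that X is (a,b)-unbreakable: every separation (A,B) of order
    at most b has |A ∩ X| <= a or |B ∩ X| <= a.  Every separation arises
    from a separator C = A ∩ B plus a bipartition of the components of
    G - C, so it suffices to enumerate those (exponential; tiny graphs)."""
    X = set(X)
    for size in range(b + 1):
        for C in combinations(vertices, size):
            comps = components(vertices, edges, C)
            counts = [len(c & X) for c in comps]
            base = len(set(C) & X)  # C lies on both sides of the separation
            for mask in range(1 << len(comps)):
                left = base + sum(counts[i] for i in range(len(comps)) if mask >> i & 1)
                right = base + sum(counts[i] for i in range(len(comps)) if not mask >> i & 1)
                if left > a and right > a:
                    return False  # witnessed a breaking separation
    return True
```

For instance, the vertex set of a 5-vertex path is not $(2,1)$-unbreakable (the middle vertex splits it evenly), while the vertex set of $K_5$ is $(2,2)$-unbreakable, since removing two vertices leaves a single connected component.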
Parameterized Complexity of Fair Bisection: FPT-Approximation meets Unbreakability
In the Minimum Bisection problem, the input is a graph $G$ and the goal is to
partition the vertex set into two parts $A$ and $B$, such that
$||A| - |B|| \leq 1$ and the number of edges between $A$ and $B$ is minimized.
This problem can be viewed as a clustering problem where edges represent
similarity, and the task is to partition the vertices into two equally sized
clusters, while minimizing the number of pairs of similar objects that end up
in different clusters. In this paper, we initiate the study of a fair version
of Minimum Bisection. In this problem, the vertices of the graph are colored
using one of $c$ colors. The goal is to find a bisection $(A, B)$ with at most
$k$ edges between the parts, such that for each color $i \in [c]$, $A$ has
exactly $r_i$ vertices of color $i$.
We first show that Fair Bisection is W[1]-hard parameterized by $c$ even
when $k = 0$. On the other hand, our main technical contribution shows that
this hardness result is simply a consequence of the very strict
requirement that each color class $i$ has {\em exactly} $r_i$ vertices in $A$.
In particular, we give an $f(k, c, \epsilon) \cdot n^{O(1)}$ time algorithm that finds a
balanced partition $(A, B)$ with at most $k$ edges between them, such that for
each color $i \in [c]$, there are at most $(1 \pm \epsilon) r_i$ vertices of color $i$
in $A$. Our approximation algorithm is best viewed as a proof of concept
that the technique introduced by [Lampis, ICALP '18] for obtaining
FPT-approximation algorithms for problems of bounded tree-width or clique-width
can be efficiently exploited even on graphs of unbounded width. The key insight
is that the technique of Lampis is applicable on tree decompositions with
unbreakable bags (as introduced in [Cygan et al., SIAM Journal on Computing
'14]). Along the way, we also derive a combinatorial result regarding tree
decompositions of graphs.
Comment: Full version of ESA 2023 paper. Abstract shortened to meet the
character limit
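The exact fairness constraint that drives the hardness result is easy to state as a brute-force search on small instances. A sketch with all names my own; this is plain exhaustive enumeration, not the paper's FPT-approximation algorithm:

```python
from itertools import combinations


def fair_bisection(n, edges, colors, quotas, k):
    """Search for a bisection (A, B) with at most k crossing edges such
    that side A contains exactly quotas[i] vertices of color i.

    Exhaustive over all balanced vertex splits, so only for tiny n."""
    vertices = range(n)
    for A in combinations(vertices, n // 2):
        A = set(A)
        # Fairness check: each color class must hit its quota exactly on A.
        counts = {}
        for v in A:
            counts[colors[v]] = counts.get(colors[v], 0) + 1
        if any(counts.get(i, 0) != q for i, q in quotas.items()):
            continue
        # Cut check: count edges with exactly one endpoint in A.
        cut = sum(1 for u, v in edges if (u in A) != (v in A))
        if cut <= k:
            return A
    return None
```

On a 4-cycle with two vertices of each color, demanding one vertex of each color per side admits a bisection of cut size 2, while demanding both color-0 vertices on side $A$ forces a larger cut.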
Solving hard cut problems via flow-augmentation
We present a new technique for designing FPT algorithms for graph cut
problems in undirected graphs, which we call flow augmentation. Our technique
is applicable to problems that can be phrased as a search for an (edge)
$(s,t)$-cut of cardinality at most $k$ in an undirected graph $G$ with
designated terminals $s$ and $t$.
More precisely, we consider problems where an (unknown) solution $Z$ is a set
of at most $k$ edges such that (1) in $G - Z$, $s$ and $t$ are in
distinct connected components, (2) every edge of $Z$ connects two distinct
connected components of $G - Z$, and (3) if we define the set
$Z_{s,t} \subseteq Z$ as these edges $e \in Z$ for which there exists an $(s,t)$-path $P_e$ with
$E(P_e) \cap Z = \{e\}$, then $Z_{s,t}$ separates $s$ from $t$. We prove that
in this scenario one can in randomized time $k^{O(1)} (|V(G)| + |E(G)|)$ add a
number of edges to the graph so that with probability $2^{-O(k \log k)}$ no
added edge connects two components of $G - Z$ and $Z_{s,t}$ becomes a minimum cut
between $s$ and $t$.
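The object being manipulated here, a minimum edge $(s,t)$-cut in an undirected graph, can be computed with any max-flow routine. A small Edmonds-Karp sketch for unit capacities (my own illustration of the cut object, not the paper's augmentation procedure):

```python
from collections import deque


def min_st_cut_size(n, edges, s, t):
    """Size of a minimum edge (s,t)-cut in an undirected unit-capacity
    graph, via Edmonds-Karp max-flow: each undirected edge becomes a
    pair of capacity-1 arcs, and min cut = max flow."""
    cap = {}
    adj = [[] for _ in range(n)]
    for u, v in edges:
        if (u, v) not in cap:  # add adjacency once per unordered pair
            adj[u].append(v)
            adj[v].append(u)
        cap[(u, v)] = cap.get((u, v), 0) + 1
        cap[(v, u)] = cap.get((v, u), 0) + 1
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow  # no augmenting path: flow equals min cut size
        # Push one unit of flow along the path found.
        v = t
        while parent[v] is not None:
            u = parent[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1
```

On a path the minimum cut between the endpoints is a single edge; on $K_4$ any two vertices are separated only by removing all three edges at one of them.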
We apply our method to obtain a randomized FPT algorithm for a notorious
"hard nut" graph cut problem we call Coupled Min-Cut. This problem emerges out
of the study of FPT algorithms for Min CSP problems, and was unamenable to
other techniques for parameterized algorithms in graph cut problems, such as
Randomized Contractions, Treewidth Reduction or Shadow Removal.
To demonstrate the power of the approach, we consider more generally Min
SAT($\Gamma$), parameterized by the solution cost. We show that every problem
Min SAT($\Gamma$) is either (1) FPT, (2) W[1]-hard, or (3) able to express the
soft constraint $(u \to v)$, and thereby also the min-cut problem in directed
graphs. All the W[1]-hard cases were known or immediate, and the main new
result is an FPT algorithm for a generalization of Coupled Min-Cut.
Exploiting Dense Structures in Parameterized Complexity
Over the past few decades, the study of dense structures from the perspective of approximation algorithms has become a wide area of research. However, from the viewpoint of parameterized algorithms, this area is largely unexplored. In particular, properties of random samples have been successfully deployed to design approximation schemes for a number of fundamental problems on dense structures [Arora et al. FOCS 1995, Goldreich et al. FOCS 1996, Giotis and Guruswami SODA 2006, Karpinski and Schudy STOC 2009]. In this paper, we fill this gap, and harness the power of random samples as well as structure theory to design kernelization as well as parameterized algorithms on dense structures. In particular, we obtain linear vertex kernels for Edge-Disjoint Paths, Edge Odd Cycle Transversal, Minimum Bisection, d-Way Cut, Multiway Cut and Multicut on everywhere dense graphs. In fact, these kernels are obtained by designing a polynomial-time algorithm when the corresponding parameter is at most $\mathcal{O}(n)$. Additionally, we obtain a cubic kernel for Vertex-Disjoint Paths on everywhere dense graphs. In addition to kernelization results, we obtain randomized subexponential-time parameterized algorithms for Edge Odd Cycle Transversal, Minimum Bisection, and d-Way Cut. Finally, we show how all of our results (as well as EPASes for these problems) can be de-randomized.
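The sampling idea underlying such approximation schemes can be illustrated on the simplest statistic: estimating the number of edges crossing a fixed bipartition by sampling vertex pairs, which concentrates well precisely when the graph is dense. A toy sketch of my own, not the paper's kernelization:

```python
import random


def exact_cut(edges, A):
    """Number of edges with exactly one endpoint in A."""
    return sum(1 for u, v in edges if (u in A) != (v in A))


def sampled_cut(n, edge_set, A, samples, rng):
    """Estimate the cut size of bipartition (A, V \\ A) by sampling
    unordered vertex pairs uniformly and scaling the hit fraction back
    up to the total number of pairs."""
    hits = 0
    for _ in range(samples):
        u, v = rng.sample(range(n), 2)
        if (min(u, v), max(u, v)) in edge_set and (u in A) != (v in A):
            hits += 1
    return hits / samples * n * (n - 1) / 2
```

On $K_4$ with $A = \{0, 1\}$ the true cut is 4 of the 6 pairs, so a few thousand samples land very close to 4; on sparse graphs the same estimator would need far more samples, which is why density is the enabling assumption.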
Open Problems in (Hyper)Graph Decomposition
Large networks are useful in a wide range of applications. Sometimes problem
instances are composed of billions of entities. Decomposing and analyzing these
structures helps us gain new insights about our surroundings. Even if the final
application concerns a different problem (such as traversal, finding paths,
trees, and flows), decomposing large graphs is often an important subproblem
for complexity reduction or parallelization. This report is a summary of
discussions that happened at Dagstuhl seminar 23331 on "Recent Trends in Graph
Decomposition" and presents currently open problems and future directions in
the area of (hyper)graph decomposition.
On Weighted Graph Separation Problems and Flow-Augmentation
One of the first applications of the recently introduced technique of flow-augmentation [Kim et al., STOC 2022] is a fixed-parameter algorithm for the weighted version of Directed Feedback Vertex Set, a landmark problem in parameterized complexity. In this note we explore the applicability of flow-augmentation to other weighted graph separation problems parameterized by the size of the cutset. We show the following.
* In weighted undirected graphs, Multicut is FPT, both in the edge- and vertex-deletion version.
* The weighted version of Group Feedback Vertex Set is FPT, even with an oracle access to group operations.
* The weighted version of Directed Subset Feedback Vertex Set is FPT.
Our study reveals Directed Symmetric Multicut as the next important graph separation problem whose parameterized complexity remains unknown, even in the unweighted setting.
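The Multicut condition itself is straightforward to verify: delete the candidate cutset and check that every terminal pair is disconnected in what remains. A small checker under my own naming, unrelated to the FPT algorithms above:

```python
def is_multicut(n, edges, cutset, pairs):
    """Check whether removing the edges in `cutset` separates every
    terminal pair (s, t) in the remaining undirected graph."""
    removed = {frozenset(e) for e in cutset}
    adj = [[] for _ in range(n)]
    for u, v in edges:
        if frozenset((u, v)) not in removed:
            adj[u].append(v)
            adj[v].append(u)

    def reachable(s):
        # Depth-first search from s in the graph minus the cutset.
        seen, stack = {s}, [s]
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        return seen

    return all(t not in reachable(s) for s, t in pairs)
```

On the path 0-1-2-3 with terminal pairs (0, 3) and (1, 2), deleting the middle edge is a multicut, while deleting only the first edge leaves the pair (1, 2) connected.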