Subset feedback vertex set is fixed parameter tractable
The classical Feedback Vertex Set problem asks, for a given undirected graph
G and an integer k, to find a set of at most k vertices that hits all the
cycles in the graph G. Feedback Vertex Set has attracted a large amount of
research in the parameterized setting, and subsequent kernelization and
fixed-parameter algorithms have been a rich source of ideas in the field.
In this paper we consider a more general and difficult version of the
problem, named Subset Feedback Vertex Set (SUBSET-FVS in short) where an
instance comes additionally with a set S ⊆ V of vertices, and we ask for a set
of at most k vertices that hits all simple cycles passing through S. Because of
its applications in circuit testing and genetic linkage analysis SUBSET-FVS was
studied from the approximation algorithms perspective by Even et al.
[SICOMP'00, SIDMA'00].
The question whether the SUBSET-FVS problem is fixed-parameter tractable was
posed independently by Kawarabayashi and by Saurabh in 2009. We answer this
question affirmatively. We begin by showing that this problem is
fixed-parameter tractable when parameterized by |S|. Next we present an
algorithm which reduces the given instance to 2^k n^O(1) instances with the
size of S bounded by O(k^3), using kernelization techniques such as the
2-Expansion Lemma, Menger's theorem and Gallai's theorem. These two facts allow
us to give a 2^O(k log k) n^O(1) time algorithm solving the Subset Feedback
Vertex Set problem, proving that it is indeed fixed-parameter tractable.
Comment: full version of a paper presented at ICALP'1
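
As a concrete illustration of the problem statement (our own sketch, not part of the paper), a candidate solution X can be verified by brute force: after deleting X, a vertex s of S still lies on a simple cycle if and only if two of its neighbours remain connected once s itself is also removed. The function names below are our own.

```python
from collections import deque

def components(adj, banned):
    """Label the connected components of the graph with `banned` vertices removed."""
    label, comp = {}, 0
    for start in adj:
        if start in banned or start in label:
            continue
        comp += 1
        label[start] = comp
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in banned and w not in label:
                    label[w] = comp
                    queue.append(w)
    return label

def is_subset_fvs(adj, S, X):
    """Check that X hits every simple cycle passing through S: after deleting X,
    s lies on a simple cycle iff two of its neighbours share a component of
    the graph with s (and X) removed."""
    X = set(X)
    for s in set(S) - X:
        label = components(adj, X | {s})
        seen = set()
        for w in adj[s]:
            if w in X:
                continue
            if label[w] in seen:
                return False  # two neighbours in one component: a cycle through s survives
            seen.add(label[w])
    return True
```

For example, on a triangle 1-2-3 with a pendant vertex 4 attached to 3, the empty set is not a solution for S = {1}, but {2} is, and no deletion is needed for S = {4} since no simple cycle passes through vertex 4.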
Detecting Feedback Vertex Sets of Size k in O*(2.7^k) Time
In the Feedback Vertex Set problem, one is given an undirected graph G and
an integer k, and one needs to determine whether there exists a set of k
vertices that intersects all cycles of G (a so-called feedback vertex set).
Feedback Vertex Set is one of the most central problems in parameterized
complexity: It served as an excellent test bed for many important algorithmic
techniques in the field such as Iterative Compression [Guo et al. (JCSS'06)],
Randomized Branching [Becker et al. (J. Artif. Intell. Res'00)] and
Cut&Count [Cygan et al. (FOCS'11)]. In particular, there has been a long race
for the smallest dependence on k in run times of the type O*(c^k),
where the O* notation omits factors polynomial in the input size. This race
seemed to be run in 2011, when a randomized O*(3^k) time algorithm
based on Cut&Count was introduced.
In this work, we show the contrary and give an O*(2.7^k) time
randomized algorithm. Our algorithm combines all mentioned techniques with
substantial new ideas: First, we show that, given a feedback vertex set of size
k of bounded average degree, a tree decomposition of width (1 - Ω(1))k
can be found in polynomial time. Second, we give a randomized branching
strategy inspired by the one from [Becker et al. (J. Artif. Intell. Res'00)] to
reduce to the aforementioned bounded average degree setting. Third, we obtain
significant run time improvements by employing fast matrix multiplication.
Comment: SODA 2020, 22 pages
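
The randomized branching scheme of Becker et al. mentioned above can be sketched as follows. This is a simplified illustration under our own naming, not the paper's O*(2.7^k) algorithm: after exhaustively removing degree-at-most-1 vertices, taking self-loop vertices, and bypassing degree-2 vertices, every vertex has degree at least 3, and then a random endpoint of a random edge lies in some minimum feedback vertex set with probability at least 1/4; repeating the trial O(4^k) times gives a Monte Carlo algorithm.

```python
import random

def reduce_and_pick(edges, k, rng):
    """One randomized trial: apply the standard FVS reductions on a multigraph
    (edge list), then guess a random endpoint of a random edge.
    Returns a feedback vertex set of size <= k on success, or None."""
    edges = [tuple(e) for e in edges]
    fvs = set()
    while True:
        if not edges:
            return fvs                      # remaining graph is acyclic
        deg = {}
        for u, v in edges:
            deg[u] = deg.get(u, 0) + 1
            deg[v] = deg.get(v, 0) + 1
        loops = [u for u, v in edges if u == v]
        if loops:                           # self-loop: its vertex is forced
            u = loops[0]
            fvs.add(u)
            edges = [e for e in edges if u not in e]
        elif any(d <= 1 for d in deg.values()):
            low = {u for u, d in deg.items() if d <= 1}
            edges = [e for e in edges if not (set(e) & low)]
        elif any(d == 2 for d in deg.values()):
            # bypass a degree-2 vertex (may create a multi-edge or self-loop)
            v = next(u for u, d in deg.items() if d == 2)
            inc = [e for e in edges if v in e]
            a = inc[0][0] if inc[0][1] == v else inc[0][1]
            b = inc[1][0] if inc[1][1] == v else inc[1][1]
            edges = [e for e in edges if v not in e] + [(a, b)]
        else:
            # all degrees >= 3: a random endpoint of a random edge hits
            # some minimum FVS with probability >= 1/4
            u, v = rng.choice(edges)
            w = rng.choice([u, v])
            fvs.add(w)
            edges = [e for e in edges if w not in e]
        if len(fvs) > k:
            return None

def randomized_fvs(edges, k, trials=None, seed=0):
    """Repeat the trial O(4^k) times; any returned set is a valid FVS, and a
    size-<=k FVS is found with constant probability if one exists."""
    rng = random.Random(seed)
    if trials is None:
        trials = 3 * 4 ** k
    for _ in range(trials):
        res = reduce_and_pick(edges, k, rng)
        if res is not None:
            return res
    return None
```

On small inputs the reduction rules alone already solve the instance: a path is recognized as acyclic with budget 0, and a triangle is reduced deterministically to a single forced vertex.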
Lossy Kernelization
In this paper we propose a new framework for analyzing the performance of
preprocessing algorithms. Our framework builds on the notion of kernelization
from parameterized complexity. However, as opposed to the original notion of
kernelization, our definitions combine well with approximation algorithms and
heuristics. The key new definition is that of a polynomial size
α-approximate kernel. Loosely speaking, a polynomial size
α-approximate kernel is a polynomial time pre-processing algorithm that
takes as input an instance (I, k) to a parameterized problem, and outputs
another instance (I', k') to the same problem, such that
|I'| + k' ≤ k^O(1). Additionally, for every c ≥ 1, a c-approximate solution
to the pre-processed instance (I', k') can be turned in polynomial time into a
(c · α)-approximate solution to the original instance (I, k).
Our main technical contributions are α-approximate kernels of
polynomial size for three problems, namely Connected Vertex Cover, Disjoint
Cycle Packing and Disjoint Factors. These problems are known not to admit any
polynomial size kernels unless NP ⊆ coNP/poly. Our approximate
kernels simultaneously beat both the lower bounds on the (normal) kernel size,
and the hardness of approximation lower bounds for all three problems. On the
negative side we prove that Longest Path parameterized by the length of the
path and Set Cover parameterized by the universe size do not admit even an
α-approximate kernel of polynomial size, for any α ≥ 1, unless
NP ⊆ coNP/poly. In order to prove this lower bound we need to combine
in a non-trivial way the techniques used for showing kernelization lower bounds
with the methods for showing hardness of approximation.
Comment: 58 pages. Version 2 contains new results: PSAKS for Cycle Packing and
approximate kernel lower bounds for Set Cover and Hitting Set parameterized
by universe size
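
The reduce/lift interface in the definition above can be illustrated on a toy case (our own sketch, not from the paper): the classical Buss rules for Vertex Cover form an exact, i.e. α = 1, kernel with polynomial-time solution lifting. A vertex of degree larger than the remaining budget must belong to every small enough cover, so it is forced into the solution; lifting then simply adds the forced vertices back.

```python
def reduce_vc(edges, k):
    """Buss-style kernelization for Vertex Cover phrased in the reduce/lift
    style of approximate kernels (alpha = 1 here). Returns (edges', k', forced),
    where `forced` holds high-degree vertices that every size-<=k cover must
    contain. A full kernel would additionally reject once more than k'^2
    edges remain, since max degree is then at most k'."""
    forced = set()
    edges = [tuple(e) for e in edges]
    changed = True
    while changed:
        changed = False
        deg = {}
        for u, v in edges:
            deg[u] = deg.get(u, 0) + 1
            deg[v] = deg.get(v, 0) + 1
        budget = k - len(forced)
        high = [u for u, d in deg.items() if d > budget]
        if high and budget >= 0:
            forced.add(high[0])
            edges = [e for e in edges if high[0] not in e]
            changed = True
    return edges, k - len(forced), forced

def lift_vc(sol_reduced, forced):
    """Solution lifting: a c-approximate cover of the reduced instance plus the
    forced vertices is a c-approximate cover of the original instance."""
    return set(sol_reduced) | forced
```

On a star with centre 0 and five leaves with k = 1, the centre is forced, the reduced instance becomes empty, and lifting the empty solution returns the optimal cover {0}.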