Solving Connectivity Problems Parameterized by Treedepth in Single-Exponential Time and Polynomial Space
A breakthrough result of Cygan et al. (FOCS 2011) showed that connectivity problems parameterized by treewidth can be solved much faster than the previously best known time O^*(2^{O(tw log tw)}). Using their Cut&Count technique, they obtained O^*(c^{tw}) time algorithms, for small constants c, for many such problems. Moreover, they proved these running times to be optimal assuming the Strong Exponential-Time Hypothesis. Unfortunately, like other dynamic programming algorithms on tree decompositions, these algorithms also require exponential space, and this is widely believed to be unavoidable. In contrast, for the slightly larger parameter called treedepth, there are already several examples of matching the time bounds obtained for treewidth, but using only polynomial space. Nevertheless, this has remained open for connectivity problems.
In the present work, we close this knowledge gap by applying the Cut&Count technique to graphs of small treedepth. While the general idea is unchanged, we have to design novel procedures for counting consistently cut solution candidates using only polynomial space. Concretely, we obtain time O^*(3^d) and polynomial space for Connected Vertex Cover, Feedback Vertex Set, and Steiner Tree on graphs of treedepth d. Similarly, we obtain time O^*(4^d) and polynomial space for Connected Dominating Set and Connected Odd Cycle Transversal.
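The parity identity at the heart of Cut&Count can be checked by brute force on a toy instance. The sketch below (graph, names, and setup are ours, not the paper's) defines a "consistent cut" of a candidate vertex set X as a partition (L, R) of X, with a fixed root forced into L, such that no edge inside G[X] crosses the cut; the number of such cuts is odd exactly when X is connected.

```python
from itertools import combinations

# Toy graph as an edge list (hypothetical example, not from the paper).
V = range(5)
E = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2)]

def components(X):
    """Number of connected components of the subgraph induced by X."""
    X, seen, comps = set(X), set(), 0
    for s in X:
        if s in seen:
            continue
        comps += 1
        stack = [s]
        while stack:
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            for a, b in E:
                if a == u and b in X:
                    stack.append(b)
                elif b == u and a in X:
                    stack.append(a)
    return comps

def consistent_cuts(X):
    """Count cuts (L, R) of X with X[0] forced into L and no induced edge crossing."""
    root, rest = X[0], list(X[1:])
    count = 0
    for mask in range(2 ** len(rest)):
        L = {root} | {rest[i] for i in range(len(rest)) if mask >> i & 1}
        if all((a in L) == (b in L) for a, b in E if a in X and b in X):
            count += 1
    return count

# Every component of G[X] must lie entirely on one side of a consistent cut,
# and the root's component is pinned to L, so there are exactly 2^(cc(X)-1)
# consistent cuts: an odd number precisely when X is connected. Counting
# (candidate, cut) pairs hence counts connected candidates modulo 2.
for k in range(1, 6):
    for X in combinations(V, k):
        assert consistent_cuts(X) == 2 ** (components(X) - 1)
```

This is why Cut&Count can replace an explicit connectivity check by a (much simpler) counting computation, which is what makes the polynomial-space treedepth recursion possible.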
Parameterized Approximation Scheme for Feedback Vertex Set
Feedback Vertex Set (FVS) is one of the most studied vertex deletion problems in the field of graph algorithms. In the decision version of the problem, given a graph G and an integer k, the question is whether there exists a set S of at most k vertices in G such that G-S is acyclic. It is one of the first few problems which were shown to be NP-complete, and has been extensively studied from the viewpoint of approximation and parameterized algorithms. The best-known polynomial time approximation algorithm for FVS is a 2-factor approximation, while the best known deterministic and randomized FPT algorithms run in time O^*(3.460^k) and O^*(2.7^k), respectively.
In this paper, we contribute to the newly established area of parameterized approximation by studying FVS in this paradigm. In particular, we combine the approaches of parameterized and approximation algorithms for the study of FVS, and achieve an approximation factor better than 2 in randomized FPT running time that improves over the best known parameterized algorithm for FVS. We give three simple randomized (1+ε)-approximation algorithms for FVS, running in times O^*(2^{εk} · 2.7^{(1-ε)k}), O^*(((4/(1+ε))^{(1+ε)} · (ε/3)^ε)^k), and O^*(4^{(1-ε)k}), respectively, for every ε ∈ (0,1). Combining these three algorithms, we obtain a factor-(1+ε) approximation algorithm for FVS which has a better running time than the best-known (randomized) FPT algorithm for every ε ∈ (0,1). To the best of our knowledge, this is the first attempt at a parameterized approximation of FVS. Our algorithms are very simple, and they rely on some well-known reduction rules used for arriving at FPT algorithms for FVS.
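The decision version stated above can be pinned down with a brute-force sketch. The instance and names below are our own toy illustration, not the paper's algorithm; it only demonstrates what "G-S is acyclic" means, using the fact that a graph is a forest iff its edge count equals vertices minus components.

```python
from itertools import combinations

# Tiny illustrative instance (ours): two triangles sharing vertex 2.
V = list(range(5))
E = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]

def is_forest(X):
    """G[X] is a forest iff |E(G[X])| = |X| - #components(G[X])."""
    X = set(X)
    m = sum(1 for a, b in E if a in X and b in X)
    seen, c = set(), 0
    for s in X:  # count components by DFS
        if s in seen:
            continue
        c += 1
        stack = [s]
        while stack:
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            for a, b in E:
                if a == u and b in X:
                    stack.append(b)
                elif b == u and a in X:
                    stack.append(a)
    return m == len(X) - c

def has_fvs(k):
    """Decision version: is there S with |S| <= k such that G - S is acyclic?"""
    return any(is_forest(set(V) - set(S))
               for r in range(k + 1)
               for S in combinations(V, r))

print(has_fvs(0), has_fvs(1))  # vertex 2 alone hits both triangles
```

The FPT algorithms discussed in the abstract replace this exponential enumeration over all vertex subsets by branching whose depth is bounded by k.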
Improved FPT Algorithms for Deletion to Forest-Like Structures
The Feedback Vertex Set problem is undoubtedly one of the most well-studied problems in Parameterized Complexity. In this problem, given an undirected graph G and a non-negative integer k, the objective is to test whether there exists a subset S ⊆ V(G) of size at most k such that G-S is a forest. After a long line of improvements, Li and Nederlof [SODA, 2020] recently designed a randomized algorithm for the problem running in time O^*(2.7^k). In the Parameterized Complexity literature, several problems around Feedback Vertex Set have been studied. Some of these include Independent Feedback Vertex Set (where the set S should be an independent set in G), Almost Forest Deletion and Pseudoforest Deletion. In Pseudoforest Deletion, the objective is to find a set S of size at most k such that each connected component of G-S has at most one cycle. In Almost Forest Deletion, the input is a graph G and non-negative integers k and ℓ, and the objective is to test whether there exists a vertex subset S of size at most k such that G-S is ℓ edges away from a forest. In this paper, using the methodology of Li and Nederlof [SODA, 2020], we obtain the current fastest algorithms for all these problems. In particular, we obtain the following randomized algorithms.
1) Independent Feedback Vertex Set can be solved in time O^*(2.7^k).
2) Pseudoforest Deletion can be solved in time O^*(2.85^k).
3) Almost Forest Deletion can be solved in time O^*(min{2.85^k · 8.54^ℓ, 2.7^k · 36.61^ℓ, 3^k · 1.78^ℓ}).
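The two target structures above have simple quantitative characterizations, which the following illustrative helpers (our own names, not the paper's) pin down: a graph is "ℓ edges away from a forest" iff its cycle rank m - n + c is at most ℓ, and it is a pseudoforest iff every connected component has at most as many edges as vertices.

```python
def component_stats(vertices, edges):
    """Yield (n_i, m_i) = (vertex count, edge count) per connected component."""
    vertices = set(vertices)
    seen = set()
    for s in vertices:
        if s in seen:
            continue
        comp, stack = set(), [s]
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            for a, b in edges:
                if a == u and b in vertices:
                    stack.append(b)
                elif b == u and a in vertices:
                    stack.append(a)
        seen |= comp
        m_i = sum(1 for a, b in edges if a in comp and b in comp)
        yield len(comp), m_i

def edges_away_from_forest(vertices, edges):
    """Cycle rank: minimum number of edge deletions that leave a forest."""
    return sum(m_i - (n_i - 1) for n_i, m_i in component_stats(vertices, edges))

def is_pseudoforest(vertices, edges):
    """At most one cycle per component, i.e. m_i <= n_i everywhere."""
    return all(m_i <= n_i for n_i, m_i in component_stats(vertices, edges))

# K4 has cycle rank 6 - 4 + 1 = 3; a single triangle is a pseudoforest.
K4 = [(a, b) for a in range(4) for b in range(a + 1, 4)]
print(edges_away_from_forest(range(4), K4))             # 3
print(is_pseudoforest(range(3), [(0, 1), (1, 2), (0, 2)]))  # True
print(is_pseudoforest(range(4), K4))                    # False
```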
Search-Space Reduction via Essential Vertices
We investigate preprocessing for vertex-subset problems on graphs. While the notion of kernelization, originating in parameterized complexity theory, is a formalization of provably effective preprocessing aimed at reducing the total instance size, our focus is on finding a non-empty vertex set that belongs to an optimal solution. This decreases the size of the remaining part of the solution that still has to be found, and therefore shrinks the search space of fixed-parameter tractable algorithms for parameterizations based on the solution size. We introduce the notion of a c-essential vertex as one that is contained in all c-approximate solutions. For several classic combinatorial problems such as Odd Cycle Transversal and Directed Feedback Vertex Set, we show that under mild conditions a polynomial-time preprocessing algorithm can find a subset of an optimal solution that contains all 2-essential vertices, by exploiting packing/covering duality. This leads to FPT algorithms for these problems in which the exponential term in the running time depends only on the number of non-essential vertices in the solution.
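The definition of a c-essential vertex can be made concrete by exhaustive search on a tiny instance. The sketch below is our own toy illustration (it uses Vertex Cover as a stand-in vertex-subset problem, not one of the problems the paper treats): it computes the optimum and then intersects all solutions of size at most c times the optimum.

```python
from itertools import combinations

# Toy instance (ours): a star K_{1,4}; its center must be in every small cover.
V = range(5)
E = [(0, i) for i in range(1, 5)]

def is_cover(S):
    return all(a in S or b in S for a, b in E)

def solutions_up_to(size):
    """All vertex covers of size at most `size`."""
    return [set(S) for r in range(size + 1)
            for S in combinations(V, r) if is_cover(set(S))]

opt = min(len(S) for S in solutions_up_to(len(list(V))))

# A vertex is c-essential if it lies in every c-approximate solution.
c = 2
essential = set(V)
for S in solutions_up_to(c * opt):
    essential &= S
print(opt, essential)  # the star center is 2-essential
```

The point of the paper is that such vertices can be identified in polynomial time under suitable conditions, so the FPT search only has to handle the non-essential part of the solution.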
Isolation Schemes for Problems on Decomposable Graphs
The Isolation Lemma of Mulmuley, Vazirani and Vazirani [Combinatorica'87] provides a self-reduction scheme that allows one to assume that a given instance of a problem has a unique solution, provided a solution exists at all. Since its introduction, much effort has been dedicated towards derandomization of the Isolation Lemma for specific classes of problems. So far, the focus was mainly on problems solvable in polynomial time. In this paper, we study a setting that is more typical for NP-complete problems, and obtain partial derandomizations in the form of significantly decreasing the number of required random bits. In particular, motivated by the advances in parameterized algorithms, we focus on problems on decomposable graphs. For example, for the problem of detecting a Hamiltonian cycle, we build upon the rank-based approach from [Bodlaender et al., Inf. Comput.'15] and design isolation schemes that need few random bits on graphs of bounded treewidth, on planar or H-minor-free graphs, and on general graphs. In all these schemes, the weights are bounded exponentially in the number of random bits used. As a corollary, for every fixed H we obtain an algorithm for detecting a Hamiltonian cycle in an H-minor-free graph that runs in deterministic subexponential time and uses polynomial space; this is the first algorithm to achieve such complexity guarantees. For problems of a more local nature, such as finding an independent set of maximum size, we obtain isolation schemes on graphs of bounded treedepth that use few random bits and assign polynomially-bounded weights. We also complement our findings with several unconditional and conditional lower bounds, which show that many of these results cannot be significantly improved.
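The statement of the Isolation Lemma is easy to verify exhaustively on a tiny universe. The sketch below (the set family is our own arbitrary choice) enumerates every weight function w : {0,..,n-1} → {1,..,2n} and checks that a unique minimum-weight set exists in well over half of the cases, as the lemma guarantees.

```python
from itertools import combinations, product

# Isolation Lemma (Mulmuley, Vazirani, Vazirani): for any family F of subsets
# of an n-element universe, independent uniform weights in {1, ..., 2n} make
# the minimum-weight member of F unique with probability at least 1/2.
n = 3
universe = range(n)
# A toy family (ours): all non-empty subsets of the universe.
F = [set(c) for r in range(1, n + 1) for c in combinations(universe, r)]

unique = total = 0
for w in product(range(1, 2 * n + 1), repeat=n):  # all (2n)^n weight functions
    total += 1
    sums = sorted(sum(w[i] for i in S) for S in F)
    if sums[0] < sums[1]:  # strict gap => unique minimum-weight set
        unique += 1
print(unique, "/", total)  # 165 / 216, comfortably above the 1/2 bound
```

The derandomization question studied in the paper is how few random bits (and how small weights) suffice to achieve this isolation for solution families of NP-complete problems on decomposable graphs.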
Detecting Feedback Vertex Sets of Size k in O^*(2.7^k) Time
In the Feedback Vertex Set problem, one is given an undirected graph G and an integer k, and one needs to determine whether there exists a set of k vertices that intersects all cycles of G (a so-called feedback vertex set). Feedback Vertex Set is one of the most central problems in parameterized complexity: It served as an excellent test bed for many important algorithmic techniques in the field such as Iterative Compression [Guo et al. (JCSS'06)], Randomized Branching [Becker et al. (J. Artif. Intell. Res'00)] and Cut&Count [Cygan et al. (FOCS'11)]. In particular, there has been a long race for the smallest dependence f(k) in run times of the type O^*(f(k)), where the O^* notation omits factors polynomial in n. This race seemed to be run in 2011, when a randomized O^*(3^k) time algorithm based on Cut&Count was introduced.
In this work, we show the contrary and give an O^*(2.7^k) time randomized algorithm. Our algorithm combines all mentioned techniques with substantial new ideas: First, we show that, given a feedback vertex set of size k of bounded average degree, a tree decomposition of width (1-Ω(1))k can be found in polynomial time. Second, we give a randomized branching strategy inspired by the one from [Becker et al. (J. Artif. Intell. Res'00)] to reduce to the aforementioned bounded average degree setting. Third, we obtain significant run time improvements by employing fast matrix multiplication.
Comment: SODA 2020, 22 pages.
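The decomposition step above improves on an easy folklore baseline: a graph with a feedback vertex set S always admits a tree decomposition of width |S| + 1, obtained by taking the forest G - S with bags {v, parent(v)}, adding S to every bag, and chaining the tree roots. The sketch below (our own toy instance and names, not the paper's construction) builds and verifies that baseline; the paper's contribution is pushing the width down to (1-Ω(1))k in the bounded average-degree case.

```python
def fvs_tree_decomposition(V, E, S):
    """Return (bags, tree_edges) of width |S| + 1; assumes G - S is a forest."""
    F = [v for v in V if v not in S]
    adj = {v: [] for v in F}
    for a, b in E:
        if a in adj and b in adj:
            adj[a].append(b)
            adj[b].append(a)
    parent, roots, seen = {}, [], set()
    for s in F:  # root every tree of the forest G - S
        if s in seen:
            continue
        roots.append(s)
        parent[s] = None
        stack = [s]
        seen.add(s)
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    parent[w] = u
                    stack.append(w)
    # Bag of v: v, its forest parent (if any), and all of S.
    bags = {v: {v} | set(S) | ({parent[v]} if parent[v] is not None else set())
            for v in F}
    tree_edges = [(v, parent[v]) for v in F if parent[v] is not None]
    tree_edges += list(zip(roots, roots[1:]))  # chain the roots into one tree
    return bags, tree_edges

# Toy instance (ours): two triangles sharing vertex 2, with S = {2} an FVS.
V = range(5)
E = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]
S = {2}
bags, T = fvs_tree_decomposition(V, E, S)
assert max(len(B) for B in bags.values()) - 1 == len(S) + 1  # width |S| + 1
assert all(any(a in B and b in B for B in bags.values()) for a, b in E)
```

Since S appears in every bag, edges touching S and the connectivity of each vertex's occurrences are covered automatically; shaving a constant fraction off this width, as the paper does, directly shrinks the base of the Cut&Count dynamic program.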