
    Lossy Kernelization

    In this paper we propose a new framework for analyzing the performance of preprocessing algorithms. Our framework builds on the notion of kernelization from parameterized complexity. However, as opposed to the original notion of kernelization, our definitions combine well with approximation algorithms and heuristics. The key new definition is that of a polynomial-size α-approximate kernel. Loosely speaking, a polynomial-size α-approximate kernel is a polynomial-time preprocessing algorithm that takes as input an instance (I, k) of a parameterized problem and outputs another instance (I', k') of the same problem such that |I'| + k' ≤ k^{O(1)}. Additionally, for every c ≥ 1, a c-approximate solution s' to the preprocessed instance (I', k') can be turned in polynomial time into a (c · α)-approximate solution s to the original instance (I, k). Our main technical contributions are α-approximate kernels of polynomial size for three problems, namely Connected Vertex Cover, Disjoint Cycle Packing and Disjoint Factors. These problems are known not to admit any polynomial-size kernels unless NP ⊆ coNP/poly. Our approximate kernels simultaneously beat both the lower bounds on the (normal) kernel size and the hardness-of-approximation lower bounds for all three problems. On the negative side, we prove that Longest Path parameterized by the length of the path and Set Cover parameterized by the universe size do not admit even an α-approximate kernel of polynomial size, for any α ≥ 1, unless NP ⊆ coNP/poly. In order to prove this lower bound we need to combine, in a non-trivial way, the techniques used for showing kernelization lower bounds with the methods for showing hardness of approximation. Comment: 58 pages. Version 2 contains new results: PSAKS for Cycle Packing and approximate kernel lower bounds for Set Cover and Hitting Set parameterized by universe size.
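    In symbols, the definition above splits into two conditions (this only restates the abstract; the labels "reduce" and "lift" are not from the paper):

        \[ \text{reduce: } (I,k) \;\mapsto\; (I',k') \quad\text{in polynomial time, with}\quad |I'| + k' \le k^{O(1)} \]
        \[ \text{lift: } \forall c \ge 1:\ s' \text{ a } c\text{-approximate solution of } (I',k') \;\Longrightarrow\; s \text{ a } (c \cdot \alpha)\text{-approximate solution of } (I,k), \text{ computable in polynomial time.} \]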

    Parameterized Approximation Schemes for Steiner Trees with Small Number of Steiner Vertices

    We study the Steiner Tree problem, in which a set of terminal vertices needs to be connected in the cheapest possible way in an edge-weighted graph. This problem has been extensively studied from the viewpoints of both approximation and parameterization. In particular, Steiner Tree is known to be APX-hard on the one hand, and W[2]-hard on the other when parameterized by the number of non-terminals (Steiner vertices) in the optimum solution. In contrast to this, we give an efficient parameterized approximation scheme (EPAS), which circumvents both hardness results. Moreover, our methods imply the existence of a polynomial-size approximate kernelization scheme (PSAKS) for the considered parameter. We further study the parameterized approximability of other variants of Steiner Tree, such as Directed Steiner Tree and Steiner Forest. For neither of these is an EPAS likely to exist for the studied parameter: for Steiner Forest an easy observation shows that the problem is APX-hard even if the input graph contains no Steiner vertices, and for Directed Steiner Tree we prove that approximating within any function of the studied parameter is W[1]-hard. Nevertheless, we show that an EPAS exists for Unweighted Directed Steiner Tree, but a PSAKS does not. We also prove that there is an EPAS and a PSAKS for Steiner Forest if, in addition to the number of Steiner vertices, the number of connected components of an optimal solution is considered to be a parameter. Comment: 23 pages, 6 figures. An extended abstract appeared in the proceedings of STACS 2018.
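    The EPAS notion used here follows the usual convention (this phrasing is the standard definition, not quoted from the paper); with p denoting the number of Steiner vertices in an optimum solution:

        \[ \text{EPAS: for every } \varepsilon > 0, \text{ a } (1+\varepsilon)\text{-approximate solution is computable in time } f(\varepsilon, p) \cdot |I|^{O(1)}. \]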

    An Approximate Kernel for Connected Feedback Vertex Set

    The Feedback Vertex Set problem is a fundamental computational problem which has been the subject of intensive study in various domains of algorithmics. In this problem, one is given an undirected graph G and an integer k as input. The objective is to determine whether at most k vertices can be deleted from G such that the resulting graph is acyclic. The study of preprocessing algorithms for this problem has a long and rich history, culminating in the quadratic kernelization of Thomassé [SODA 2010]. However, it is known that when the solution is required to induce a connected subgraph (such a set is called a connected feedback vertex set), a polynomial kernelization is unlikely to exist and the problem is NP-hard to approximate below a factor of 2 (assuming the Unique Games Conjecture). In this paper, we show that if one is interested in only preserving approximate solutions (even of quality arbitrarily close to the optimum), then there is a drastic improvement in our ability to preprocess this problem. Specifically, we prove that for every fixed 0 < epsilon < 1, graph G, and k in N, the following holds: there is a polynomial-time computable graph G' of size k^O(1) such that for every c >= 1, any c-approximate connected feedback vertex set of G' of size at most k is a c * (1+epsilon)-approximate connected feedback vertex set of G. Our result adds to the set of approximate kernelization algorithms introduced by Lokshtanov et al. [STOC 2017]. As a consequence of our main result, we show that Connected Feedback Vertex Set can be approximated within a factor min{OPT^O(1), n^(1-delta)} in polynomial time for some delta > 0.
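    The preprocessing guarantee stated above, written as a single implication in the same notation as the abstract (nothing added):

        \[ |G'| \le k^{O(1)}, \quad\text{and}\quad \forall c \ge 1:\ S \text{ a } c\text{-approximate connected FVS of } G' \text{ with } |S| \le k \;\Longrightarrow\; S \text{ is a } c(1+\epsilon)\text{-approximate connected FVS of } G. \]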

    Lossy Kernelization for (Implicit) Hitting Set Problems

    We revisit the complexity of polynomial-time preprocessing (kernelization) for the d-Hitting Set problem. This is one of the most classic problems in Parameterized Complexity in its own right, and, furthermore, it encompasses several other of the most well-studied problems in this field, such as Vertex Cover, Feedback Vertex Set in Tournaments (FVST) and Cluster Vertex Deletion (CVD). In fact, d-Hitting Set encompasses any deletion problem to a hereditary property that can be characterized by a finite set of forbidden induced subgraphs. With respect to bit size, the kernelization complexity of d-Hitting Set is essentially settled: there exists a kernel with O(k^d) bits (O(k^d) sets and O(k^{d-1}) elements), and this is tight by the result of Dell and van Melkebeek [STOC 2010, JACM 2014]. Still, the question of whether there exists a kernel for d-Hitting Set with fewer elements has remained one of the major open problems in kernelization. In this paper, we first show that if we allow the kernelization to be lossy with a qualitatively better loss than the best possible approximation ratio of polynomial-time approximation algorithms, then one can obtain kernels where the number of elements is linear for every fixed d. Further, based on this, we present our main result: we show that there exist approximate Turing kernelizations for d-Hitting Set that even beat the established bit-size lower bounds for exact kernelizations - in fact, we use a constant number of oracle calls, each of "near linear" (O(k^{1+epsilon})) bit size, which is almost the best one could hope for. Lastly, for two special cases of implicit 3-Hitting Set, namely FVST and CVD, we obtain "best of both worlds" type results: (1+epsilon)-approximate kernelizations with a linear number of vertices. In terms of size, this substantially improves the exact kernels of Fomin et al. [SODA 2018, TALG 2019], with simpler arguments.
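    For reference, the problem being kernelized (standard definition, not quoted from the paper): an instance consists of a universe U, a family F of subsets of U each of size at most d, and an integer k; the question is whether some S ⊆ U with |S| ≤ k intersects every set in F. The size bounds discussed above, collected in one place:

        \[ \text{exact kernel: } O(k^d) \text{ bits}, \qquad \text{lossy kernel: } O(k) \text{ elements for fixed } d, \qquad \text{approximate Turing kernel: } O(k^{1+\epsilon}) \text{ bits per oracle call.} \]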

    Lossy Kernels for Graph Contraction Problems

    We study some well-known graph contraction problems in the recently introduced framework of lossy kernelization. In classical kernelization, given an instance (I, k) of a parameterized problem, we are interested in obtaining (in polynomial time) an equivalent instance (I', k') of the same problem whose size is bounded by a function of k. This notion, however, has a major limitation: given an approximate solution to the instance (I', k'), we can say nothing about the original instance (I, k). To handle this issue, among others, the framework of lossy kernelization was introduced. In this framework, for a constant alpha, given an instance (I, k) we obtain an instance (I', k') of the same problem such that, for every c > 1, any c-approximate solution to (I', k') can be turned into a (c * alpha)-approximate solution to the original instance (I, k) in polynomial time. Naturally, we are interested in a polynomial-time algorithm for this task, and further require that |I'| + k' = k^{O(1)}. Akin to the notion of polynomial-time approximation schemes in approximation algorithms, a parameterized problem is said to admit a polynomial-size approximate kernelization scheme (PSAKS) if it admits a polynomial-size alpha-approximate kernel for every approximation parameter alpha > 1. In this work, we design PSAKSs for the Tree Contraction, Star Contraction, Out-Tree Contraction and Cactus Contraction problems. These problems do not admit polynomial kernels, and we show that each of them admits a PSAKS with running time k^{f(alpha)} |I|^{O(1)} that returns an instance of size k^{g(alpha)}, where f(alpha) and g(alpha) are constants depending on alpha.
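    Written out, the scheme notion used here (restating only the abstract's definition):

        \[ \text{PSAKS: for every } \alpha > 1, \text{ an } \alpha\text{-approximate kernel with running time } k^{f(\alpha)} \cdot |I|^{O(1)} \text{ and output size } k^{g(\alpha)}. \]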

    Lossy Kernelization of Same-Size Clustering

    In this work, we study the k-median clustering problem with an additional equal-size constraint on the clusters from the perspective of parameterized preprocessing. Our main result is the first lossy (2-approximate) polynomial kernel for this problem, parameterized by the cost of the clustering. We complement this result by establishing lower bounds for the problem that rule out the existence of an (exact) kernel of polynomial size and of a PTAS.
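    The abstract does not spell out the objective; one standard way to write the equal-size k-median cost of a partition of an n-point set P into clusters C_1, ..., C_k (the metric d and the choice of centers from P are assumptions here, not taken from the paper) is:

        \[ \mathrm{cost} \;=\; \sum_{i=1}^{k} \min_{c_i \in P} \sum_{p \in C_i} d(p, c_i), \qquad |C_1| = \dots = |C_k| = n/k, \]

    and the parameter used above is this cost.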

    Lossy Kernels for Connected Dominating Set on Sparse Graphs

    For alpha > 1, an alpha-approximate (bi-)kernel for a problem Q is a polynomial-time algorithm that takes as input an instance (I, k) of Q and outputs an instance (I', k') (of a problem Q') of size bounded by a function of k such that, for every c >= 1, a c-approximate solution for the new instance can be turned into a (c * alpha)-approximate solution of the original instance in polynomial time. This framework of lossy kernelization was recently introduced by Lokshtanov et al. We study Connected Dominating Set (and its distance-r variant) parameterized by solution size on sparse graph classes like biclique-free graphs, classes of bounded expansion, and nowhere dense classes. We prove that for every alpha > 1, Connected Dominating Set admits a polynomial-size alpha-approximate (bi-)kernel on all the aforementioned classes. Our results are in sharp contrast to the kernelization complexity of Connected Dominating Set, which is known not to admit a polynomial kernel even on 2-degenerate graphs and graphs of bounded expansion, unless NP ⊆ coNP/poly. We complement our results by the following conditional lower bound: we show that if a class C is somewhere dense and closed under taking subgraphs, then for some value of r in N there cannot exist an alpha-approximate bi-kernel for the (Connected) Distance-r Dominating Set problem on C for any alpha > 1 (assuming the Gap Exponential Time Hypothesis).
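    For completeness, the distance-r variant referred to above is the standard one (standard definition, not quoted from the paper):

        \[ D \subseteq V(G) \text{ is a distance-}r\text{ dominating set if } \mathrm{dist}_G(v, D) \le r \text{ for every } v \in V(G); \]

    in the connected variant G[D] must additionally be connected, and r = 1 recovers (Connected) Dominating Set.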