
    Linear-vertex kernel for the problem of packing r-stars into a graph without long induced paths

    Let integers $r \ge 2$ and $d \ge 3$ be fixed. Let $\mathcal{G}_d$ be the set of graphs with no induced path on $d$ vertices. We study the problem of packing $k$ vertex-disjoint copies of $K_{1,r}$ ($k \ge 2$) into a graph $G$ from the parameterized preprocessing, i.e., kernelization, point of view. We show that every graph $G \in \mathcal{G}_d$ can be reduced, in polynomial time, to a graph $G' \in \mathcal{G}_d$ with $O(k)$ vertices such that $G$ has at least $k$ vertex-disjoint copies of $K_{1,r}$ if and only if $G'$ has. Such a result is known for arbitrary graphs $G$ when $r = 2$, and we conjecture that it holds for every $r \ge 2$.
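    As a point of reference for the object being packed, the sketch below greedily collects vertex-disjoint copies of $K_{1,r}$ (a star with one center and $r$ leaves). It is only a minimal illustration, not the paper's kernelization; the function name and the adjacency-dict graph representation are assumptions made here.

```python
# Minimal illustration: greedily pack vertex-disjoint copies of K_{1,r}
# in a graph given as an adjacency dict.  Not the paper's algorithm.
def greedy_star_packing(adj, r):
    used = set()                      # vertices already consumed by a star
    stars = []
    for center in adj:
        if center in used:
            continue
        free = [v for v in adj[center] if v not in used]
        if len(free) >= r:            # center still has r unused neighbours
            leaves = free[:r]
            used.add(center)
            used.update(leaves)
            stars.append((center, leaves))
    return stars

# Example: an induced path on five vertices admits one K_{1,2}.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(greedy_star_packing(path, 2))   # [(1, [0, 2])]
```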

    Kernelization and Parameterized Algorithms for 3-Path Vertex Cover

    A 3-path vertex cover in a graph is a vertex subset $C$ such that every path on three vertices contains at least one vertex from $C$. The parameterized 3-path vertex cover problem asks whether a graph has a 3-path vertex cover of size at most $k$. In this paper, we give a kernel of $5k$ vertices and an $O^*(1.7485^k)$-time, polynomial-space algorithm for this problem; both results improve the previously known bounds. Comment: in TAMC 2016, LNCS 9796, 201
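    A set $C$ fails to be a 3-path vertex cover exactly when some vertex outside $C$ keeps two neighbours outside $C$, i.e. when the graph induced on the remaining vertices has a vertex of degree at least two. The checker below uses this characterization; it is an illustrative sketch (names and representation chosen here), not part of the paper's kernel or branching algorithm.

```python
# Check whether `cover` is a 3-path vertex cover: no path on three
# vertices may survive once the cover is removed, which holds iff the
# remaining graph has maximum degree at most 1.
def is_3path_vertex_cover(adj, cover):
    for v in adj:
        if v in cover:
            continue
        outside_degree = sum(1 for u in adj[v] if u not in cover)
        if outside_degree >= 2:       # v is the middle of a surviving P3
            return False
    return True

path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(is_3path_vertex_cover(path, {2}))    # True: only edges remain
print(is_3path_vertex_cover(path, set()))  # False: 0-1-2 survives
```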

    Data Reductions and Combinatorial Bounds for Improved Approximation Algorithms

    Kernelization algorithms in the context of Parameterized Complexity are often based on a combination of reduction rules and combinatorial insights. In this paper we present a similar strategy for obtaining polynomial-time approximation algorithms. Our method features the use of approximation-preserving reductions, akin to the notion of parameterized reductions. We exemplify this method to obtain the currently best approximation algorithms for \textsc{Harmless Set}, \textsc{Differential} and \textsc{Multiple Nonblocker}, all of which can be considered in the context of securing networks or information propagation.
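    The reduce-then-solve-then-lift pattern described above can be written down generically; the skeleton below is purely hypothetical (the rule interface, names, and types are assumptions, and the paper's problem-specific reduction rules and lifting steps are not reproduced).

```python
# Hypothetical skeleton of an approximation algorithm driven by
# approximation-preserving reduction rules.  Each rule is a pair
# (apply_rule, lift): `apply_rule` returns (smaller_instance, info) or
# None, and `lift` turns a solution of the smaller instance back into
# one of the original instance, preserving the approximation guarantee.
def approximate_via_reductions(instance, rules, base_solver):
    trace = []
    changed = True
    while changed:                       # apply rules until none fires
        changed = False
        for apply_rule, lift in rules:
            result = apply_rule(instance)
            if result is not None:
                instance, info = result
                trace.append((lift, info))
                changed = True
    solution = base_solver(instance)     # solve the reduced instance
    for lift, info in reversed(trace):   # replay the reductions backwards
        solution = lift(solution, info)
    return solution
```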

    Characterizing the easy-to-find subgraphs from the viewpoint of polynomial-time algorithms, kernels, and Turing kernels

    We study two fundamental problems related to finding subgraphs: (1) given graphs G and H, Subgraph Test asks if H is isomorphic to a subgraph of G; (2) given graphs G, H, and an integer t, Packing asks if G contains t vertex-disjoint subgraphs isomorphic to H. For every graph class F, let F-Subgraph Test and F-Packing be the special cases of the two problems where H is restricted to be in F. Our goal is to study which classes F make the two problems tractable in one of the following senses: they are (randomized) polynomial-time solvable; they admit a polynomial (many-one) kernel; or they admit a polynomial Turing kernel (that is, an adaptive polynomial-time procedure that reduces the problem to a polynomial number of instances, each of which has size bounded polynomially by the size of the solution). We identify a simple combinatorial property such that if a hereditary class F has this property, then F-Packing admits a polynomial kernel, and otherwise it has no polynomial (many-one) kernel unless the polynomial hierarchy collapses. Furthermore, if F does not have this property, then F-Packing is either WK[1]-hard, W[1]-hard, or Long Path-hard, giving evidence that it does not admit a polynomial Turing kernel either. For F-Subgraph Test, we show that if every graph of a hereditary class F satisfies the property that it is possible to delete a bounded number of vertices such that every remaining component has size at most two, then F-Subgraph Test is solvable in randomized polynomial time, and it is NP-hard otherwise. We introduce a combinatorial property called (a,b,c,d)-splittability and show that if every graph in a hereditary class F has this property, then F-Subgraph Test admits a polynomial Turing kernel, and it is WK[1]-hard, W[1]-hard, or Long Path-hard otherwise. Comment: 69 pages, extended abstract to appear in the proceedings of SODA 201
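    The deletion property quoted above for F-Subgraph Test (remove a bounded number of vertices so that every remaining component has at most two vertices) can be tested directly on small graphs. The brute-force checker below is only meant to make the property concrete; it is exponential in the deletion budget, is not the paper's machinery, and its name and graph representation are assumptions made here.

```python
from itertools import combinations

# Can at most b vertices be deleted from H (adjacency dict) so that every
# remaining connected component has at most max_comp vertices?
def components_small_after_deletion(adj, b, max_comp=2):
    vertices = list(adj)
    for size in range(b + 1):
        for removed in combinations(vertices, size):
            seen = set(removed)
            ok = True
            for s in vertices:
                if s in seen:
                    continue
                comp, stack = 0, [s]      # explore the component of s
                seen.add(s)
                while stack:
                    v = stack.pop()
                    comp += 1
                    for u in adj[v]:
                        if u not in seen:
                            seen.add(u)
                            stack.append(u)
                if comp > max_comp:
                    ok = False
                    break
            if ok:
                return True
    return False

# Example: a path on five vertices needs one deletion (its middle vertex).
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(components_small_after_deletion(path, 0))  # False
print(components_small_after_deletion(path, 1))  # True
```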

    Parameterized Algorithms for Zero Extension and Metric Labelling Problems

    We consider the problems Zero Extension and Metric Labelling under the paradigm of parameterized complexity. These are natural, well-studied problems with important applications, but they have previously not received much attention from this area. Depending on the chosen cost function $\mu$, we find that different algorithmic approaches can be applied to design FPT algorithms: for arbitrary $\mu$ we parameterize by the number of edges that cross the cut (not the cost) and show how to solve Zero Extension in time $O(|D|^{O(k^2)} n^4 \log n)$ using randomized contractions. We improve this running time with respect to both the parameter and the input size to $O(|D|^{O(k)} m)$ in the case where $\mu$ is a metric. We further show that the problem admits a polynomial sparsifier, that is, a kernel of size $O(k^{|D|+1})$ that is independent of the metric $\mu$. Under the stronger condition that $\mu$ is described by the distances of leaves in a tree, we parameterize by a gap parameter $(q - p)$ between the cost of a true solution $q$ and a `discrete relaxation' $p$, and achieve a running time of $O(|D|^{q-p} |T| m + |T| \phi(n,m))$, where $|T|$ is the size of the tree over which $\mu$ is defined and $\phi(n,m)$ is the running time of a max-flow computation. We achieve a similar result for the more general Metric Labelling, while also allowing $\mu$ to be the distance metric between an arbitrary subset of nodes in a tree, using tools from the theory of VCSPs. We expect the methods used in the latter result to have further applications.
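    For orientation, a standard way to state the two problems is given below (the paper's exact conventions, e.g. edge weights or the choice of terminals, may differ): in Zero Extension a subset $T \subseteq V$ of terminals comes pre-labelled by $f_0 : T \to D$ and the labelling must be extended to all of $V$; Metric Labelling additionally charges per-vertex assignment costs. The parameter $k$ used above counts the edges whose endpoints receive different labels.

```latex
% Zero Extension (standard formulation, stated here for orientation):
\min_{\substack{f : V \to D \\ f|_T = f_0}} \ \sum_{uv \in E} \mu\bigl(f(u), f(v)\bigr)

% Metric Labelling (standard formulation with assignment costs c and edge weights w):
\min_{f : V \to D} \ \sum_{v \in V} c\bigl(v, f(v)\bigr) \;+\; \sum_{uv \in E} w(uv)\, \mu\bigl(f(u), f(v)\bigr)
```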

    Preprocessing Subgraph and Minor Problems: When Does a Small Vertex Cover Help?

    We prove a number of results around kernelization of problems parameterized by the size of a given vertex cover of the input graph. We provide three sets of simple general conditions characterizing problems admitting kernels of polynomial size. Our characterizations not only give generic explanations for the existence of many known polynomial kernels for problems like q-Coloring, Odd Cycle Transversal, Chordal Deletion, Eta Transversal, or Long Path, parameterized by the size of a vertex cover, but also imply new polynomial kernels for problems like F-Minor-Free Deletion, which is to delete at most k vertices to obtain a graph with no minor from a fixed finite set F. While our characterization captures many interesting problems, the kernelization complexity landscape of parameterizations by vertex cover is much more involved. We demonstrate this by several results about induced subgraph and minor containment testing, which we find surprising. While it was known that testing for an induced complete subgraph has no polynomial kernel unless NP is in coNP/poly, we show that the problem of testing if a graph contains a complete graph on t vertices as a minor admits a polynomial kernel. On the other hand, it was known that testing for a path on t vertices as a minor admits a polynomial kernel, but we show that testing for containment of an induced path on t vertices is unlikely to admit a polynomial kernel. Comment: To appear in the Journal of Computer and System Science
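    Kernels parameterized by the size of a vertex cover are usually stated assuming such a cover is given with the input; when it is not, a cover of at most twice the minimum size can be obtained from any maximal matching, which is the standard way to bootstrap the parameter. The sketch below shows that step only (names are chosen here, not taken from the paper).

```python
# Standard 2-approximation of a vertex cover: greedily build a maximal
# matching and take both endpoints of every matching edge.
def vertex_cover_2approx(adj):
    cover, matched = set(), set()
    for u in adj:
        if u in matched:
            continue
        for v in adj[u]:
            if v not in matched:      # edge u-v extends the matching
                matched.update((u, v))
                cover.update((u, v))
                break
    return cover

path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(vertex_cover_2approx(path))     # {0, 1, 2, 3}; an optimum cover is {1, 3}
```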