
    Bidimensionality and EPTAS

    Bidimensionality theory is a powerful framework for the development of meta-algorithmic techniques. It was introduced by Demaine et al. as a tool to obtain sub-exponential time parameterized algorithms for problems on H-minor-free graphs. Demaine and Hajiaghayi extended the theory to obtain PTASs for bidimensional problems, and subsequently improved these results to EPTASs. Fomin et al. related the theory to the existence of linear kernels for parameterized problems. In this paper we revisit bidimensionality theory from the perspective of approximation algorithms and redesign the framework for obtaining EPTASs to be more powerful, easier to apply, and easier to understand. Two of the most widely used approaches to obtain PTASs on planar graphs are the Lipton-Tarjan separator-based approach and Baker's approach. Demaine and Hajiaghayi strengthened both approaches using bidimensionality and obtained EPTASs for a multitude of problems. We unify the two strengthened approaches to combine the best of both worlds. At the heart of our framework is a decomposition lemma which states that for "most" bidimensional problems, there is a polynomial-time algorithm which, given an H-minor-free graph G and an ε > 0 as input, outputs a vertex set X of size ε · OPT such that the treewidth of G \ X is f(ε). Here, OPT is the objective function value of the problem in question and f is a function depending only on ε. This allows us to obtain EPTASs on (apex)-minor-free graphs for all problems covered by the previous framework, as well as for a wide range of packing problems, partial covering problems, and problems that are neither closed under taking minors nor contractions. To the best of our knowledge, for many of these problems, including Cycle Packing, Vertex-H-Packing, Maximum Leaf Spanning Tree, and Partial r-Dominating Set, no EPTASs on planar graphs were previously known.
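    The decomposition lemma at the core of this framework can be restated roughly as follows. This is a paraphrase of the abstract above, not the paper's verbatim statement; the precise conditions on the problem Π are given in the paper itself:

```latex
% Informal restatement of the decomposition lemma from the abstract.
% The exact technical conditions on \Pi are spelled out in the paper.
\begin{lemma}[Decomposition, informal]
Let $\Pi$ be a ``typical'' bidimensional problem and let $H$ be a fixed graph.
There is a polynomial-time algorithm that, given an $H$-minor-free graph $G$
and $\varepsilon > 0$, outputs a vertex set $X \subseteq V(G)$ with
\[
  |X| \le \varepsilon \cdot \mathrm{OPT}(G)
  \quad\text{and}\quad
  \mathrm{tw}(G \setminus X) \le f(\varepsilon),
\]
where $\mathrm{OPT}(G)$ is the optimum value of $\Pi$ on $G$ and
$f$ is a function depending only on $\varepsilon$.
\end{lemma}
```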

    Dagstuhl Reports : Volume 1, Issue 2, February 2011

    Online Privacy: Towards Informational Self-Determination on the Internet (Dagstuhl Perspectives Workshop 11061) : Simone Fischer-Hübner, Chris Hoofnagle, Kai Rannenberg, Michael Waidner, Ioannis Krontiris and Michael Marhöfer
    Self-Repairing Programs (Dagstuhl Seminar 11062) : Mauro Pezzé, Martin C. Rinard, Westley Weimer and Andreas Zeller
    Theory and Applications of Graph Searching Problems (Dagstuhl Seminar 11071) : Fedor V. Fomin, Pierre Fraigniaud, Stephan Kreutzer and Dimitrios M. Thilikos
    Combinatorial and Algorithmic Aspects of Sequence Processing (Dagstuhl Seminar 11081) : Maxime Crochemore, Lila Kari, Mehryar Mohri and Dirk Nowotka
    Packing and Scheduling Algorithms for Information and Communication Services (Dagstuhl Seminar 11091) : Klaus Jansen, Claire Mathieu, Hadas Shachnai and Neal E. Young

    Kernelization and Parameterized Algorithms for 3-Path Vertex Cover

    A 3-path vertex cover in a graph is a vertex subset C such that every path on three vertices contains at least one vertex from C. The parameterized 3-path vertex cover problem asks whether a graph has a 3-path vertex cover of size at most k. In this paper, we give a kernel of 5k vertices and an O*(1.7485^k)-time, polynomial-space algorithm for this problem; both results improve previously known bounds.
    Comment: in TAMC 2016, LNCS 9796, 2016
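    As a quick illustration of the definition (not the paper's kernelization or branching algorithm), note that a graph contains no path on three vertices exactly when every connected component has at most two vertices, so a candidate cover can be verified as in the following sketch; the adjacency-dict representation and function name are illustrative assumptions:

```python
from collections import deque

def is_3_path_vertex_cover(adj, cover):
    """Check whether `cover` is a 3-path vertex cover of the graph `adj`.

    `adj` maps each vertex to a set of neighbours (undirected graph).
    After deleting `cover`, the remainder contains no path on three
    vertices iff every connected component has at most 2 vertices,
    which we verify by BFS.
    """
    remaining = set(adj) - set(cover)
    seen = set()
    for start in remaining:
        if start in seen:
            continue
        # BFS over the component of `start` in the graph minus the cover.
        component_size, queue = 0, deque([start])
        seen.add(start)
        while queue:
            v = queue.popleft()
            component_size += 1
            for w in adj[v]:
                if w in remaining and w not in seen:
                    seen.add(w)
                    queue.append(w)
        if component_size > 2:  # any connected graph on >= 3 vertices contains a P3
            return False
    return True

# Usage: on the path a-b-c-d, {b} hits both 3-vertex paths a-b-c and b-c-d,
# while the empty set leaves a-b-c uncovered.
path = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
assert is_3_path_vertex_cover(path, {"b"})
assert not is_3_path_vertex_cover(path, set())
```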

    Lossy Kernelization

    In this paper we propose a new framework for analyzing the performance of preprocessing algorithms. Our framework builds on the notion of kernelization from parameterized complexity. However, as opposed to the original notion of kernelization, our definitions combine well with approximation algorithms and heuristics. The key new definition is that of a polynomial-size α-approximate kernel. Loosely speaking, a polynomial-size α-approximate kernel is a polynomial-time preprocessing algorithm that takes as input an instance (I, k) of a parameterized problem and outputs another instance (I', k') of the same problem such that |I'| + k' ≤ k^{O(1)}. Additionally, for every c ≥ 1, a c-approximate solution s' to the preprocessed instance (I', k') can be turned in polynomial time into a (c · α)-approximate solution s to the original instance (I, k). Our main technical contributions are α-approximate kernels of polynomial size for three problems, namely Connected Vertex Cover, Disjoint Cycle Packing and Disjoint Factors. These problems are known not to admit any polynomial-size kernels unless NP ⊆ coNP/poly. Our approximate kernels simultaneously beat both the lower bounds on the (normal) kernel size and the hardness-of-approximation lower bounds for all three problems. On the negative side, we prove that Longest Path parameterized by the length of the path and Set Cover parameterized by the universe size do not admit even an α-approximate kernel of polynomial size, for any α ≥ 1, unless NP ⊆ coNP/poly. In order to prove this lower bound we need to combine, in a non-trivial way, the techniques used for showing kernelization lower bounds with the methods for showing hardness of approximation.
    Comment: 58 pages. Version 2 contains new results: PSAKS for Cycle Packing and approximate kernel lower bounds for Set Cover and Hitting Set parameterized by universe size
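    Reading the abstract's definition as a pair of maps, an α-approximate kernel can be written out as follows; this is a paraphrase of the abstract's informal statement, not the paper's verbatim definition:

```latex
% alpha-approximate kernel, paraphrasing the abstract above (not verbatim).
An \emph{$\alpha$-approximate kernel of polynomial size} for a parameterized
problem consists of two polynomial-time algorithms:
\begin{itemize}
  \item a \emph{reduction} mapping an instance $(I, k)$ to an instance
        $(I', k')$ of the same problem with $|I'| + k' \le k^{O(1)}$;
  \item a \emph{solution lifting} that, for every $c \ge 1$, turns a
        $c$-approximate solution $s'$ of $(I', k')$ into a
        $(c \cdot \alpha)$-approximate solution $s$ of $(I, k)$.
\end{itemize}
```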