
    A Discharging Method: Improved Kernels for Edge Triangle Packing and Covering

    Edge Triangle Packing and Edge Triangle Covering are dual problems extensively studied in the field of parameterized complexity. Given a graph G and an integer k, Edge Triangle Packing asks whether G contains a set of at least k edge-disjoint triangles, while Edge Triangle Covering asks whether there is a set of at most k edges that intersects all triangles of G. Previous research has shown that Edge Triangle Packing has a kernel of (3+ε)k vertices, while Edge Triangle Covering has a kernel of 6k vertices. In this paper, we show that both problems admit kernels of 3k vertices, improving all previous results. A significant contribution of our work is a novel discharging method for analyzing kernel size, which shows potential for the analysis of other kernel algorithms.
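
    To make the two problem definitions concrete, here is a minimal, runnable Python sketch (an illustration of the problems, not the paper's kernelization): it greedily collects edge-disjoint triangles. Since each chosen triangle can block at most three triangles of an optimal packing, finding at least k triangles certifies a yes-instance of Edge Triangle Packing, while finding fewer does not certify a no-instance.

    from itertools import combinations

    def greedy_edge_triangle_packing(n, edges, k):
        # Build adjacency sets for the graph G on vertices 0..n-1.
        adj = [set() for _ in range(n)]
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        free = {frozenset(e) for e in edges}   # edges not yet used by a triangle
        packing = []
        for u, v in edges:
            if frozenset((u, v)) not in free:
                continue
            for w in adj[u] & adj[v]:          # a common neighbor closes a triangle
                tri = [frozenset((u, v)), frozenset((u, w)), frozenset((v, w))]
                if all(e in free for e in tri):
                    free.difference_update(tri)
                    packing.append((u, v, w))
                    break
        return len(packing) >= k, packing

    # K4 has four triangles, but any two of them share an edge, so the
    # largest edge-disjoint packing has size 1.
    edges = list(combinations(range(4), 2))
    print(greedy_edge_triangle_packing(4, edges, 2))   # (False, [one triangle])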

    Unified almost linear kernels for generalized covering and packing problems on nowhere dense classes

    Let F be a family of graphs, and let p and r be nonnegative integers. The (p,r,F)-Covering problem asks whether, for a graph G and an integer k, there exists a set D of at most k vertices in G such that G^p ∖ N_G^r[D] has no induced subgraph isomorphic to a graph in F, where G^p is the p-th power of G. The (p,r,F)-Packing problem asks whether, for a graph G and an integer k, G^p has k induced subgraphs H_1, …, H_k such that each H_i is isomorphic to a graph in F and, for distinct i, j ∈ {1, …, k}, the distance between V(H_i) and V(H_j) in G is larger than r. We show that for all fixed nonnegative integers p, r and every fixed nonempty finite family F of connected graphs, the (p,r,F)-Covering problem with p ≤ 2r+1 and the (p,r,F)-Packing problem with p ≤ 2⌊r/2⌋+1 admit almost linear kernels on every nowhere dense class of graphs, and admit linear kernels on every class of graphs with bounded expansion, parameterized by the solution size k. We obtain the same kernels for their annotated variants. As corollaries, we prove that Distance-r Vertex Cover, Distance-r Matching, F-Free Vertex Deletion, and Induced-F-Packing for any fixed finite family F of connected graphs admit almost linear kernels on every nowhere dense class of graphs and linear kernels on every class of graphs with bounded expansion. Our results extend the results for Distance-r Dominating Set by Drange et al. (STACS 2016) and Eickmeyer et al. (ICALP 2017), and the result for Distance-r Independent Set by Pilipczuk and Siebertz (EJC 2021). Comment: 38 pages.
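
    As a concrete reading of the covering condition, the following runnable Python sketch (our illustration, specialized to the single-graph family F = {K_2}) checks whether deleting the ball N_G^r[D] from G^p leaves an edge; with p = 1 this matches the Distance-r Vertex Cover condition from the list of corollaries.

    from collections import deque
    from itertools import combinations

    def ball(adj, sources, r):
        # N_G^r[sources]: all vertices within distance r of the source set (BFS).
        dist = {s: 0 for s in sources}
        q = deque(sources)
        while q:
            u = q.popleft()
            if dist[u] < r:
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        q.append(v)
        return set(dist)

    def is_covering(adj, D, p, r):
        # (p, r, {K_2})-Covering check: after deleting N_G^r[D] from G^p,
        # no pair of remaining vertices may be at distance <= p in G.
        rest = [u for u in range(len(adj)) if u not in ball(adj, D, r)]
        return all(v not in ball(adj, [u], p) for u, v in combinations(rest, 2))

    # Path 0-1-2-3-4 with D = {2}: N_G^1[D] = {1, 2, 3}, leaving 0 and 4,
    # which are at distance 4 in G.
    adj = [[1], [0, 2], [1, 3], [2, 4], [3]]
    print(is_covering(adj, [2], p=1, r=1))   # True
    print(is_covering(adj, [2], p=4, r=1))   # False: 0 and 4 adjacent in G^4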

    Bidimensionality and EPTAS

    Bidimensionality theory is a powerful framework for the development of meta-algorithmic techniques. It was introduced by Demaine et al. as a tool to obtain sub-exponential time parameterized algorithms for problems on H-minor-free graphs. Demaine and Hajiaghayi extended the theory to obtain PTASs for bidimensional problems, and subsequently improved these results to EPTASs. Fomin et al. related the theory to the existence of linear kernels for parameterized problems. In this paper we revisit bidimensionality theory from the perspective of approximation algorithms and redesign the framework for obtaining EPTASs to be more powerful, easier to apply, and easier to understand. Two of the most widely used approaches to obtain PTASs on planar graphs are the Lipton-Tarjan separator-based approach and Baker's approach. Demaine and Hajiaghayi strengthened both approaches using bidimensionality and obtained EPTASs for a multitude of problems. We unify the two strengthened approaches to combine the best of both worlds. At the heart of our framework is a decomposition lemma which states that for "most" bidimensional problems, there is a polynomial-time algorithm which, given an H-minor-free graph G and an ε > 0 as input, outputs a vertex set X of size ε · OPT such that the treewidth of G ∖ X is f(ε). Here, OPT is the objective function value of the problem in question and f is a function depending only on ε. This allows us to obtain EPTASs on (apex)-minor-free graphs for all problems covered by the previous framework, as well as for a wide range of packing problems, partial covering problems, and problems that are closed neither under taking minors nor under contractions. To the best of our knowledge, for many of these problems, including Cycle Packing, Vertex-H-Packing, Maximum Leaf Spanning Tree, and Partial r-Dominating Set, no EPTASs on planar graphs were previously known.
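
    The accounting behind the EPTAS is short enough to spell out; the sketch below assumes a minimization problem whose solutions stay feasible when the deleted set X is added back (a simplifying assumption; the paper's framework is considerably broader).

    % Given X with |X| <= eps * OPT and tw(G \ X) <= f(eps):
    \begin{align*}
      \mathrm{tw}(G \setminus X) \le f(\epsilon)
        &\;\Rightarrow\; G \setminus X \text{ is solved optimally by DP in time } g(\epsilon)\, n^{O(1)},\\
      \mathrm{OPT}(G \setminus X) \le \mathrm{OPT}(G)
        &\;\Rightarrow\; |S^{*}_{G \setminus X}| \le \mathrm{OPT}(G),\\
      |S^{*}_{G \setminus X} \cup X| \le \mathrm{OPT}(G) + \epsilon \cdot \mathrm{OPT}(G)
        &\;\Rightarrow\; \text{a } (1+\epsilon)\text{-approximation overall.}
    \end{align*}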

    On Polynomial Kernels for Integer Linear Programs: Covering, Packing and Feasibility

    We study the existence of polynomial kernels for the problem of deciding feasibility of integer linear programs (ILPs), and for finding good solutions for covering and packing ILPs. Our main results are as follows: First, we show that the ILP Feasibility problem admits no polynomial kernelization when parameterized by both the number of variables and the number of constraints, unless NP ⊆ coNP/poly. This extends to the restricted cases of bounded variable degree and bounded number of variables per constraint, and to covering and packing ILPs. Second, we give a polynomial kernelization for the Cover ILP problem, which asks for a solution to Ax ≥ b with c^T x ≤ k, parameterized by k, when A is row-sparse; this generalizes a known polynomial kernelization for the special case with 0/1 variables and coefficients (d-Hitting Set).
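
    To see the d-Hitting Set correspondence concretely, here is a tiny Cover ILP instance of our own making (d = 2, so every row of A has at most two ones and A is row-sparse):

    % Universe \{1,2,3\}, sets \{1,2\} and \{2,3\}; one row per set, c = \mathbf{1}.
    \exists x \in \{0,1\}^{3}:\quad
      \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix} x
      \;\ge\; \begin{pmatrix} 1 \\ 1 \end{pmatrix},
      \qquad \mathbf{1}^{\mathsf{T}} x \le k.
    % x = (0,1,0)^{\mathsf{T}} is feasible for every k \ge 1: element 2 hits both sets.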

    Polynomial Kernels for Weighted Problems

    Kernelization is a formalization of efficient preprocessing for NP-hard problems using the framework of parameterized complexity. Among open problems in kernelization, it has been asked many times whether there are deterministic polynomial kernelizations for Subset Sum and Knapsack when parameterized by the number n of items. We answer both questions affirmatively, using an algorithm for compressing numbers due to Frank and Tardos (Combinatorica 1987). This result was first used in the context of kernelization by Marx and Végh (ICALP 2013). We further illustrate its applicability by giving polynomial kernels also for weighted versions of several well-studied parameterized problems. Furthermore, when parameterized by the different item sizes, we obtain a polynomial kernelization for Subset Sum and an exponential kernelization for Knapsack. Finally, we also obtain kernelization results for polynomial integer programs.
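
    The compression guarantee being invoked is, as we recall it from Frank and Tardos (Combinatorica 1987; bounds stated asymptotically to be safe), the following:

    % For w \in \mathbb{Q}^n and an integer N, one can compute in polynomial
    % time an integer vector \bar{w} such that
    \|\bar{w}\|_{\infty} \le 2^{O(n^{3})} \cdot N^{O(n^{2})}
    \quad\text{and}\quad
    \operatorname{sign}(w^{\mathsf{T}} b) = \operatorname{sign}(\bar{w}^{\mathsf{T}} b)
    \ \text{ for all } b \in \mathbb{Z}^{n} \text{ with } \|b\|_{1} \le N-1.
    % For Subset Sum on n items, applying this to (w, -t) with N = n + 2
    % preserves exactly which subsets hit the target t, and every weight then
    % fits into poly(n) bits, i.e., a polynomial kernel in n.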

    Dagstuhl Reports : Volume 1, Issue 2, February 2011

    Online Privacy: Towards Informational Self-Determination on the Internet (Dagstuhl Perspectives Workshop 11061): Simone Fischer-Hübner, Chris Hoofnagle, Kai Rannenberg, Michael Waidner, Ioannis Krontiris and Michael Marhöfer
    Self-Repairing Programs (Dagstuhl Seminar 11062): Mauro Pezzè, Martin C. Rinard, Westley Weimer and Andreas Zeller
    Theory and Applications of Graph Searching Problems (Dagstuhl Seminar 11071): Fedor V. Fomin, Pierre Fraigniaud, Stephan Kreutzer and Dimitrios M. Thilikos
    Combinatorial and Algorithmic Aspects of Sequence Processing (Dagstuhl Seminar 11081): Maxime Crochemore, Lila Kari, Mehryar Mohri and Dirk Nowotka
    Packing and Scheduling Algorithms for Information and Communication Services (Dagstuhl Seminar 11091): Klaus Jansen, Claire Mathieu, Hadas Shachnai and Neal E. Young

    Lossy Kernelization

    In this paper we propose a new framework for analyzing the performance of preprocessing algorithms. Our framework builds on the notion of kernelization from parameterized complexity. However, as opposed to the original notion of kernelization, our definitions combine well with approximation algorithms and heuristics. The key new definition is that of a polynomial size α-approximate kernel. Loosely speaking, a polynomial size α-approximate kernel is a polynomial-time preprocessing algorithm that takes as input an instance (I, k) of a parameterized problem and outputs another instance (I′, k′) of the same problem such that |I′| + k′ ≤ k^{O(1)}. Additionally, for every c ≥ 1, a c-approximate solution s′ to the preprocessed instance (I′, k′) can be turned in polynomial time into a (c · α)-approximate solution s to the original instance (I, k). Our main technical contributions are α-approximate kernels of polynomial size for three problems, namely Connected Vertex Cover, Disjoint Cycle Packing, and Disjoint Factors. These problems are known not to admit any polynomial size kernels unless NP ⊆ coNP/poly. Our approximate kernels simultaneously beat both the lower bounds on the (normal) kernel size and the hardness-of-approximation lower bounds for all three problems. On the negative side, we prove that Longest Path parameterized by the length of the path and Set Cover parameterized by the universe size do not admit even an α-approximate kernel of polynomial size, for any α ≥ 1, unless NP ⊆ coNP/poly. In order to prove this lower bound, we need to combine in a non-trivial way the techniques used for showing kernelization lower bounds with the methods for showing hardness of approximation. Comment: 58 pages. Version 2 contains new results: PSAKS for Cycle Packing and approximate kernel lower bounds for Set Cover and Hitting Set parameterized by universe size.
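
    The quantitative content of the definition is a simple composition, spelled out here directly from the guarantee quoted above:

    % Run any c-approximation on the kernel (I', k') and lift the result:
    |I'| + k' \le k^{O(1)}, \qquad
    s' \ c\text{-approximate on } (I', k') \;\Longrightarrow\; s \ (c \cdot \alpha)\text{-approximate on } (I, k).
    % With an exact algorithm on the kernel (c = 1) one gets an
    % \alpha-approximation overall; with a PTAS (c = 1 + \epsilon) one gets
    % \alpha(1+\epsilon), i.e., (\alpha + \epsilon')-approximate for \epsilon' = \alpha\epsilon.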