4,439 research outputs found
A Discharging Method: Improved Kernels for Edge Triangle Packing and Covering
\textsc{Edge Triangle Packing} and \textsc{Edge Triangle Covering} are dual
problems extensively studied in the field of parameterized complexity.
Given a graph $G$ and an integer $k$, \textsc{Edge Triangle Packing} asks
whether there exists a set of at least $k$ edge-disjoint triangles in $G$,
while \textsc{Edge Triangle Covering} asks whether there exists a set of at
most $k$ edges that intersects all triangles in $G$.
Previous research has shown that both problems admit kernels with a linear
number of vertices. In this paper, we give kernels with fewer vertices for
both problems, improving all previous results. A significant contribution of
our work is a novel discharging method for analyzing kernel size, which shows
potential for the analysis of other kernelization algorithms.
Unified almost linear kernels for generalized covering and packing problems on nowhere dense classes
Let $\mathcal{F}$ be a family of graphs, and let $p$ and $r$ be nonnegative
integers. The \textsc{$(p,r,\mathcal{F})$-Covering} problem asks whether for a
graph $G$ and an integer $k$, there exists a set $D$ of at most $k$ vertices in
$G$ such that $G^p \setminus N_G^r[D]$ has no induced subgraph isomorphic to a
graph in $\mathcal{F}$, where $G^p$ is the $p$-th power of $G$. The
\textsc{$(p,r,\mathcal{F})$-Packing} problem asks whether for a graph $G$ and
an integer $k$, $G^p$ has $k$ induced subgraphs $H_1, \ldots, H_k$ such that each
$H_i$ is isomorphic to a graph in $\mathcal{F}$, and for distinct $i$ and $j$, the distance between $V(H_i)$ and $V(H_j)$ in $G$ is larger than
$r$.
We show that for every fixed pair of nonnegative integers $p$ and $r$ and every fixed
nonempty finite family $\mathcal{F}$ of connected graphs, the
\textsc{$(p,r,\mathcal{F})$-Covering} problem with $p \leq 2r+1$ and the
\textsc{$(p,r,\mathcal{F})$-Packing} problem with $p \leq 2\lfloor r/2 \rfloor + 1$
admit almost linear kernels on every nowhere dense class of graphs, and admit
linear kernels on every class of graphs with bounded expansion, parameterized
by the solution size $k$. We obtain the same kernels for their annotated
variants. As corollaries, we prove that \textsc{Distance-$r$ Vertex Cover},
\textsc{Distance-$r$ Matching}, \textsc{$\mathcal{F}$-Free Vertex Deletion},
and \textsc{Induced-$\mathcal{F}$-Packing} for any fixed nonempty finite family
$\mathcal{F}$ of connected graphs admit almost linear kernels on every nowhere
dense class of graphs and linear kernels on every class of graphs with bounded
expansion. Our results extend the results for \textsc{Distance-$r$ Dominating
Set} by Drange et al. (STACS 2016) and Eickmeyer et al. (ICALP 2017), and the
result for \textsc{Distance-$r$ Independent Set} by Pilipczuk and Siebertz (EJC
2021).
Comment: 38 pages
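The distance constraint shared by these packing problems (chosen subgraphs pairwise farther than $r$ apart) can be illustrated with a small BFS-based check. This is only a sketch of the constraint for single-vertex subgraphs, i.e. \textsc{Distance-$r$ Independent Set}; the helper name is hypothetical and this is not the kernelization itself:

```python
from collections import deque

def pairwise_distance_greater(adj, vertices, r):
    """Return True iff every two chosen vertices are at distance > r.

    adj: dict mapping each vertex to an iterable of its neighbours.
    """
    chosen = list(vertices)
    for s in chosen:
        # BFS from s, exploring only up to depth r.
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            if dist[u] == r:
                continue
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        # Any other chosen vertex reached within depth r violates the constraint.
        for t in chosen:
            if t != s and t in dist:
                return False
    return True
```

On a path 1-2-3-4-5, the set {1, 4} is a valid distance-2 independent set (distance 3), while {1, 3} is not (distance exactly 2).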
Bidimensionality and EPTAS
Bidimensionality theory is a powerful framework for the development of
meta-algorithmic techniques. It was introduced by Demaine et al. as a tool to
obtain sub-exponential time parameterized algorithms for problems on H-minor
free graphs. Demaine and Hajiaghayi extended the theory to obtain PTASs for
bidimensional problems, and subsequently improved these results to EPTASs.
Fomin et al. related the theory to the existence of linear kernels for
parameterized problems. In this paper we revisit bidimensionality theory from
the perspective of approximation algorithms and redesign the framework for
obtaining EPTASs to be more powerful, easier to apply and easier to understand.
Two of the most widely used approaches to obtain PTASs on planar graphs are
the Lipton-Tarjan separator based approach, and Baker's approach. Demaine and
Hajiaghayi strengthened both approaches using bidimensionality and obtained
EPTASs for a multitude of problems. We unify the two strengthened approaches to
combine the best of both worlds. At the heart of our framework is a
decomposition lemma which states that for "most" bidimensional problems, there
is a polynomial time algorithm which, given an H-minor-free graph G and an
ε > 0 as input, outputs a vertex set X of size ε · OPT such that the treewidth
of G \ X is f(ε). Here, OPT is the objective function value of the problem in
question and f is a function depending only on ε. This allows us to obtain EPTASs on
(apex)-minor-free graphs for all problems covered by the previous framework, as
well as for a wide range of packing problems, partial covering problems and
problems that are neither closed under taking minors, nor contractions. To the
best of our knowledge, for many of these problems, including cycle packing,
vertex-h-packing, maximum leaf spanning tree, and partial r-dominating set, no
EPTASs on planar graphs were previously known.
On Polynomial Kernels for Integer Linear Programs: Covering, Packing and Feasibility
We study the existence of polynomial kernels for the problem of deciding
feasibility of integer linear programs (ILPs), and for finding good solutions
for covering and packing ILPs. Our main results are as follows: First, we show
that the ILP Feasibility problem admits no polynomial kernelization when
parameterized by both the number of variables and the number of constraints,
unless NP \subseteq coNP/poly. This extends to the restricted cases of bounded
variable degree and bounded number of variables per constraint, and to covering
and packing ILPs. Second, we give a polynomial kernelization for the Cover ILP
problem, which asks for a solution to Ax >= b with c^T x <= k, parameterized by
k, when A is row-sparse; this generalizes a known polynomial kernelization for
the special case with 0/1 variables and coefficients (d-Hitting Set).
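A minimal sketch of the Cover ILP shape follows, using Vertex Cover (the 2-Hitting Set special case mentioned above) as the instance. The brute-force check only illustrates the problem Ax >= b, c^T x <= k; it is not the paper's kernelization, and the function name is hypothetical:

```python
from itertools import product

def cover_ilp_feasible(A, b, c, k):
    """Brute-force over 0/1 assignments: is there x with Ax >= b and c.x <= k?

    A: list of constraint rows, b: right-hand sides, c: cost vector.
    Exponential in the number of variables; for illustration only.
    """
    n = len(A[0])
    for x in product([0, 1], repeat=n):
        if sum(ci * xi for ci, xi in zip(c, x)) > k:
            continue  # too expensive
        if all(sum(aij * xj for aij, xj in zip(row, x)) >= bi
               for row, bi in zip(A, b)):
            return True  # all covering constraints satisfied
    return False

# Vertex Cover of a triangle: one row per edge (each edge needs an endpoint).
A_tri = [[1, 1, 0],
         [1, 0, 1],
         [0, 1, 1]]
```

Here every row of A_tri has exactly two nonzero entries, i.e. the matrix is row-sparse in the sense relevant to the kernelization result above.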
Polynomial Kernels for Weighted Problems
Kernelization is a formalization of efficient preprocessing for NP-hard
problems using the framework of parameterized complexity. Among open problems
in kernelization it has been asked many times whether there are deterministic
polynomial kernelizations for Subset Sum and Knapsack when parameterized by the
number of items.
We answer both questions affirmatively by using an algorithm for compressing
numbers due to Frank and Tardos (Combinatorica 1987). This result was first
used in the context of kernelization by Marx and Végh (ICALP 2013). We
further illustrate its applicability by giving polynomial kernels also for
weighted versions of several well-studied parameterized problems. Furthermore,
when parameterized by the different item sizes we obtain a polynomial
kernelization for Subset Sum and an exponential kernelization for Knapsack.
Finally, we also obtain kernelization results for polynomial integer programs.
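For reference, the Subset Sum problem whose kernelization is discussed above can be decided by the textbook dynamic program over achievable sums. This is the standard pseudo-polynomial algorithm, not the Frank-Tardos number compression used in the paper:

```python
def subset_sum(items, target):
    """Decide Subset Sum: does some subset of items sum to target?

    Maintains the set of all sums reachable with a prefix of the items;
    runs in time O(n * |reachable|), pseudo-polynomial in the item values.
    """
    reachable = {0}          # the empty subset sums to 0
    for w in items:
        # Each item either extends an existing sum or is skipped.
        reachable |= {s + w for s in reachable}
    return target in reachable
```

The point of the kernelization result is orthogonal to this algorithm: with n items, the Frank-Tardos encoding shrinks the numbers themselves so that the whole instance has size polynomial in n.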
Dagstuhl Reports : Volume 1, Issue 2, February 2011
Online Privacy: Towards Informational Self-Determination on the Internet (Dagstuhl Perspectives Workshop 11061): Simone Fischer-Hübner, Chris Hoofnagle, Kai Rannenberg, Michael Waidner, Ioannis Krontiris and Michael Marhöfer
Self-Repairing Programs (Dagstuhl Seminar 11062): Mauro Pezzé, Martin C. Rinard, Westley Weimer and Andreas Zeller
Theory and Applications of Graph Searching Problems (Dagstuhl Seminar 11071): Fedor V. Fomin, Pierre Fraigniaud, Stephan Kreutzer and Dimitrios M. Thilikos
Combinatorial and Algorithmic Aspects of Sequence Processing (Dagstuhl Seminar 11081): Maxime Crochemore, Lila Kari, Mehryar Mohri and Dirk Nowotka
Packing and Scheduling Algorithms for Information and Communication Services (Dagstuhl Seminar 11091): Klaus Jansen, Claire Mathieu, Hadas Shachnai and Neal E. Young
Lossy Kernelization
In this paper we propose a new framework for analyzing the performance of
preprocessing algorithms. Our framework builds on the notion of kernelization
from parameterized complexity. However, as opposed to the original notion of
kernelization, our definitions combine well with approximation algorithms and
heuristics. The key new definition is that of a polynomial size
$\alpha$-approximate kernel. Loosely speaking, a polynomial size
$\alpha$-approximate kernel is a polynomial time pre-processing algorithm that
takes as input an instance $(I,k)$ to a parameterized problem, and outputs
another instance $(I',k')$ to the same problem, such that
$|I'| + k' \leq k^{O(1)}$. Additionally, for every $c \geq 1$, a $c$-approximate solution
to the pre-processed instance can be turned in polynomial time into a
$(c \cdot \alpha)$-approximate solution to the original instance $(I,k)$.
Our main technical contributions are $\alpha$-approximate kernels of
polynomial size for three problems, namely Connected Vertex Cover, Disjoint
Cycle Packing and Disjoint Factors. These problems are known not to admit any
polynomial size kernels unless NP \subseteq coNP/poly. Our approximate
kernels simultaneously beat both the lower bounds on the (normal) kernel size,
and the hardness of approximation lower bounds for all three problems. On the
negative side we prove that Longest Path parameterized by the length of the
path and Set Cover parameterized by the universe size do not admit even an
$\alpha$-approximate kernel of polynomial size, for any $\alpha \geq 1$, unless
NP \subseteq coNP/poly. In order to prove this lower bound we need to combine
in a non-trivial way the techniques used for showing kernelization lower bounds
with the methods for showing hardness of approximation.
Comment: 58 pages. Version 2 contains new results: PSAKS for Cycle Packing and
approximate kernel lower bounds for Set Cover and Hitting Set parameterized
by universe size.