Faster Existential FO Model Checking on Posets
We prove that the model checking problem for the existential fragment of
first-order (FO) logic on partially ordered sets is fixed-parameter tractable
(FPT) with respect to the formula and the width of a poset (the maximum size of
an antichain). While there is a long line of research into FO model checking on
graphs, the study of this problem on posets has been initiated just recently by
Bova, Ganian and Szeider (CSL-LICS 2014), who proved that the existential
fragment of FO has an FPT algorithm for a poset of fixed width. We improve upon
their result in two ways: (1) the runtime of our algorithm is
O(f(|φ|, w) · n^2) on n-element posets of width w, compared to
O(g(|φ|) · n^{h(w)}) of Bova et al., and (2) our proofs are simpler and easier to follow.
We complement this result by showing that, under a certain
complexity-theoretical assumption, the existential FO model checking problem
does not have a polynomial kernel.
Comment: Paper as accepted to the LMCS journal. An extended abstract of an
earlier version of this paper has appeared at ISAAC'14. Main changes to the
previous version are improvements in the Multicoloured Clique part (Section 4).
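To make the problem being solved concrete, here is a small brute-force checker for existential FO sentences on a finite poset. It is exponential in the number of variables, unlike the FPT algorithm above; the encoding of formulas as lists of atoms, and the sample diamond poset, are our own simplifications for illustration.

```python
from itertools import product

def exists_model_check(elements, le, atoms, num_vars):
    """Brute-force check of an existential FO sentence
    (exists x1 ... xk)(conjunction of atoms) on a finite poset.

    elements : list of poset elements
    le       : set of pairs (a, b) with a <= b (reflexive, transitive)
    atoms    : list of ("le", i, j), ("nle", i, j) or ("neq", i, j)
               constraints over variable indices 0..num_vars-1
    """
    for assign in product(elements, repeat=num_vars):
        ok = True
        for kind, i, j in atoms:
            a, b = assign[i], assign[j]
            if (kind == "le" and (a, b) not in le) or \
               (kind == "nle" and (a, b) in le) or \
               (kind == "neq" and a == b):
                ok = False
                break
        if ok:
            return True
    return False

# A diamond poset: b below x and y (an antichain of width 2), both below t.
elems = ["b", "x", "y", "t"]
pairs = {(a, a) for a in elems} | {("b", "x"), ("b", "y"),
                                   ("x", "t"), ("y", "t"), ("b", "t")}
# "There exist two distinct, pairwise incomparable elements."
antichain2 = [("neq", 0, 1), ("nle", 0, 1), ("nle", 1, 0)]
```

On the diamond poset this sentence is true (witnessed by x and y), while on any chain it is false, which is exactly what the width parameter measures.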
Cross-Composition: A New Technique for Kernelization Lower Bounds
We introduce a new technique for proving kernelization lower bounds, called
cross-composition. A classical problem L cross-composes into a parameterized
problem Q if an instance of Q with polynomially bounded parameter value can
express the logical OR of a sequence of instances of L. Building on work by
Bodlaender et al. (ICALP 2008) and using a result by Fortnow and Santhanam
(STOC 2008) we show that if an NP-complete problem cross-composes into a
parameterized problem Q then Q does not admit a polynomial kernel unless the
polynomial hierarchy collapses. Our technique generalizes and strengthens the
recent techniques of using OR-composition algorithms and of transferring the
lower bounds via polynomial parameter transformations. We show its
applicability by proving kernelization lower bounds for a number of important
graph problems with structural (non-standard) parameterizations, e.g.,
Chromatic Number, Clique, and Weighted Feedback Vertex Set do not admit
polynomial kernels with respect to the vertex cover number of the input graphs
unless the polynomial hierarchy collapses, contrasting the fact that these
problems are trivially fixed-parameter tractable for this parameter. We have
similar lower bounds for Feedback Vertex Set.
Comment: Updated information based on final version submitted to STACS 2011.
Structurally Parameterized d-Scattered Set
In d-Scattered Set we are given an (edge-weighted) graph and are asked to
select at least k vertices, so that the distance between any pair is at least
d, thus generalizing Independent Set. We provide upper and lower bounds on
the complexity of this problem with respect to various standard graph
parameters. In particular, we show the following:
- For any d >= 2, an O*(d^{tw})-time algorithm, where tw
is the treewidth of the input graph.
- A tight SETH-based lower bound matching this algorithm's performance. These
generalize known results for Independent Set.
- d-Scattered Set is W[1]-hard parameterized by vertex cover (for
edge-weighted graphs), or feedback vertex set (for unweighted graphs), even if
d is an additional parameter.
- A single-exponential algorithm parameterized by vertex cover for unweighted
graphs, complementing the above-mentioned hardness.
- A 2^{O(td^2)}-time algorithm parameterized by tree-depth
(td), as well as a matching ETH-based lower bound, both for
unweighted graphs.
We complement these mostly negative results by providing an FPT approximation
scheme parameterized by treewidth. In particular, we give an algorithm which,
for any error parameter ε > 0, runs in time
O*((tw/ε)^{O(tw)}) and returns a
d/(1+ε)-scattered set of size k, if a d-scattered set of the same
size exists.
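The object being optimized can be made concrete with a short verifier: a vertex set is d-scattered when BFS distance between every chosen pair is at least d. The graph encoding and function names below are our own illustration, not the paper's algorithms.

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, src):
    """Unweighted shortest-path distances from src via BFS."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def is_d_scattered(adj, vertices, d):
    """True iff every pair of chosen vertices is at distance >= d."""
    return all(bfs_dist(adj, u).get(v, float("inf")) >= d
               for u, v in combinations(vertices, 2))

# A path 0-1-2-3-4: {0, 2, 4} is an independent set (2-scattered)
# but not 3-scattered, while {0, 4} is even 4-scattered.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
```

For d = 2 the property coincides with independence, which is why the problem generalizes Independent Set.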
Kernelization Lower Bounds By Cross-Composition
We introduce the cross-composition framework for proving kernelization lower
bounds. A classical problem L AND/OR-cross-composes into a parameterized
problem Q if it is possible to efficiently construct an instance of Q with
polynomially bounded parameter value that expresses the logical AND or OR of a
sequence of instances of L. Building on work by Bodlaender et al. (ICALP 2008)
and using a result by Fortnow and Santhanam (STOC 2008) with a refinement by
Dell and van Melkebeek (STOC 2010), we show that if an NP-hard problem
OR-cross-composes into a parameterized problem Q then Q does not admit a
polynomial kernel unless NP \subseteq coNP/poly and the polynomial hierarchy
collapses. Similarly, an AND-cross-composition for Q rules out polynomial
kernels for Q under Bodlaender et al.'s AND-distillation conjecture.
Our technique generalizes and strengthens the recent techniques of using
composition algorithms and of transferring the lower bounds via polynomial
parameter transformations. We show its applicability by proving kernelization
lower bounds for a number of important graph problems with structural
(non-standard) parameterizations, e.g., Clique, Chromatic Number, Weighted
Feedback Vertex Set, and Weighted Odd Cycle Transversal do not admit polynomial
kernels with respect to the vertex cover number of the input graphs unless the
polynomial hierarchy collapses, contrasting the fact that these problems are
trivially fixed-parameter tractable for this parameter.
After learning of our results, several teams of authors have successfully
applied the cross-composition framework to different parameterized problems.
For completeness, our presentation of the framework includes several extensions
based on this follow-up work. For example, we show how a relaxed version of
OR-cross-compositions may be used to give lower bounds on the degree of the
polynomial in the kernel size.
Comment: A preliminary version appeared in the proceedings of the 28th
International Symposium on Theoretical Aspects of Computer Science (STACS
2011) under the title "Cross-Composition: A New Technique for Kernelization
Lower Bounds". Several results have been strengthened compared to the
preliminary version (http://arxiv.org/abs/1011.4224). 29 pages, 2 figures.
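The OR-expressing step at the heart of cross-composition can be illustrated with a textbook toy: the disjoint union of graphs expresses the logical OR of Clique instances that share the same target size k. This is only the composition step, not a lower-bound proof, and the encoding below is our own sketch.

```python
from itertools import combinations

def has_clique(n, edges, k):
    """Brute force: does the graph on vertices 0..n-1 have a k-clique?"""
    es = {frozenset(e) for e in edges}
    return any(all(frozenset(p) in es for p in combinations(c, 2))
               for c in combinations(range(n), k))

def or_compose(instances, k):
    """Disjoint union of graphs: the result has a k-clique iff at least
    one input graph does -- the logical OR of the input instances.
    Each instance is a pair (number of vertices, edge list)."""
    total, edges = 0, []
    for n, es in instances:
        # shift every instance's vertex labels into a fresh range
        edges += [(u + total, v + total) for u, v in es]
        total += n
    return total, edges
```

A clique cannot straddle two components of the union, so the construction is correct; the work in a real cross-composition lies in keeping the *parameter* of the composed instance polynomially bounded.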
Polynomial Kernels for Weighted Problems
Kernelization is a formalization of efficient preprocessing for NP-hard
problems using the framework of parameterized complexity. Among open problems
in kernelization it has been asked many times whether there are deterministic
polynomial kernelizations for Subset Sum and Knapsack when parameterized by the
number of items.
We answer both questions affirmatively by using an algorithm for compressing
numbers due to Frank and Tardos (Combinatorica 1987). This result had been
first used by Marx and V\'egh (ICALP 2013) in the context of kernelization. We
further illustrate its applicability by giving polynomial kernels also for
weighted versions of several well-studied parameterized problems. Furthermore,
when parameterized by the different item sizes we obtain a polynomial
kernelization for Subset Sum and an exponential kernelization for Knapsack.
Finally, we also obtain kernelization results for polynomial integer programs.
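For context, Subset Sum parameterized by the number of items n is trivially fixed-parameter tractable by enumerating all 2^n subsets, however large the numbers are; the open question answered above is whether the *numbers* can be shrunk to polynomially many bits in polynomial time. A minimal sketch of the trivial enumeration (our own illustration, not the Frank-Tardos compression):

```python
from itertools import combinations

def subset_sum(items, target):
    """Decide Subset Sum by enumerating all 2^n subsets: fixed-parameter
    tractable in the number of items n, regardless of number size."""
    return any(sum(c) == target
               for r in range(len(items) + 1)
               for c in combinations(items, r))
```

Note that the running time is independent of the magnitude of the numbers, which is exactly why the instance size (dominated by the encodings of the numbers) is the obstacle a kernelization must overcome.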
An FPT 2-Approximation for Tree-Cut Decomposition
The tree-cut width of a graph is a graph parameter defined by Wollan [J.
Comb. Theory, Ser. B, 110:47-66, 2015] with the help of tree-cut
decompositions. In certain cases, tree-cut width appears to be more adequate
than treewidth as an invariant that, when bounded, can accelerate the
resolution of intractable problems. While designing algorithms for problems
with bounded tree-cut width, it is important to have a parametrically tractable
way to compute the exact value of this parameter or, at least, some constant
approximation of it. In this paper we give a parameterized 2-approximation
algorithm for the computation of tree-cut width; for an input n-vertex graph G
and an integer w, our algorithm either confirms that the tree-cut width
of G is more than w or returns a tree-cut decomposition of G certifying
that its tree-cut width is at most 2w, in time 2^{O(w^2 log w)} · n^2.
Prior to this work, no constructive parameterized algorithms, even approximated
ones, existed for computing the tree-cut width of a graph. As a consequence of
the Graph Minors series by Robertson and Seymour, only the existence of a
decision algorithm was known.
Comment: 17 pages, 3 figures.
Vertex Cover Kernelization Revisited: Upper and Lower Bounds for a Refined Parameter
Kernelization is a concept that enables the formal mathematical analysis of data reduction
through the framework of parameterized complexity. Intensive research into the Vertex Cover
problem has shown that there is a preprocessing algorithm which, given an instance (G,k) of
Vertex Cover, outputs an equivalent instance (G',k') in polynomial time with the guarantee
that G' has at most 2k' vertices (and thus O((k')^2) edges) with k' <= k. Using the
terminology of parameterized complexity we say that k-Vertex Cover has a kernel with 2k
vertices. There is complexity-theoretic evidence that both 2k vertices and Theta(k^2) edges
are optimal for the kernel size. In this paper we consider the Vertex Cover problem with a
different parameter, the size fvs(G) of a minimum feedback vertex set for G. This refined
parameter is structurally smaller than the parameter k associated to the vertex covering
number VC(G), since fvs(G) <= VC(G) and the difference can be arbitrarily large. We give a
kernel for Vertex Cover with a number of vertices that is cubic in fvs(G): an instance
(G,X,k) of Vertex Cover, where X is a feedback vertex set for G, can be transformed in
polynomial time into an equivalent instance (G',X',k') such that k' <= k, |X'| <= |X| and,
most importantly, |V(G')| <= 2k and |V(G')| in O(|X'|^3). A similar result holds when the
feedback vertex set X is not given along with the input. In sharp contrast we show that the
Weighted Vertex Cover problem does not have a polynomial kernel when parameterized by fvs(G)
unless the polynomial hierarchy collapses to the third level (PH = Sigma_3^p). Our work is
one of the first examples of research in kernelization using a non-standard parameter, and
shows that this approach can yield interesting computational insights. To obtain our results
we make extensive use of the combinatorial structure of independent sets in forests.
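The notion of a kernel can be made concrete with the classic Buss reduction rules for Vertex Cover under the standard parameter k. This is a simpler (and weaker, O(k^2)-size) kernel than either the 2k-vertex kernel or the fvs-parameterized kernel discussed above; the sketch is our own illustration.

```python
def buss_kernelize(edges, k):
    """Classic Buss kernelization for Vertex Cover, parameter k:
    - a vertex of degree > k must be in every cover of size <= k,
      so take it into the cover and decrease k;
    - once all degrees are <= k, a size-k cover can cover at most
      k^2 edges, so more than k^2 remaining edges means "no".
    Returns a reduced (edges, k) pair, or None for a no-instance.
    """
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:
                # v belongs to every small cover: remove it and its edges
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:
        return None
    return edges, k
```

On a star K_{1,5} with k = 1 the high-degree rule immediately takes the center and leaves a trivial yes-instance, while a path on five vertices with k = 1 is correctly rejected.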
Bidimensionality of Geometric Intersection Graphs
Let B be a finite collection of geometric (not necessarily convex) bodies in
the plane. Clearly, this class of geometric objects naturally generalizes the
class of disks, lines, ellipsoids, and even convex polygons. We consider
geometric intersection graphs GB where each body of the collection B is
represented by a vertex, and two vertices of GB are adjacent if the
intersection of the corresponding bodies is non-empty. For such graph classes
and under natural restrictions on their maximum degree or subgraph exclusion,
we prove that the relation between their treewidth and the maximum size of a
grid minor is linear. These combinatorial results vastly extend the
applicability of all the meta-algorithmic results of the bidimensionality
theory to geometrically defined graph classes.