
    Compression via Matroids: A Randomized Polynomial Kernel for Odd Cycle Transversal

    The Odd Cycle Transversal problem (OCT) asks whether a given graph can be made bipartite by deleting at most k of its vertices. In a breakthrough result, Reed, Smith, and Vetta (Operations Research Letters, 2004) gave an O(4^k·kmn) time algorithm for it, the first algorithm with polynomial runtime of uniform degree for every fixed k. It is known that this implies a polynomial-time compression algorithm that turns OCT instances into equivalent instances of size at most O(4^k), a so-called kernelization. Since then the existence of a polynomial kernel for OCT, i.e., a kernelization with size bounded polynomially in k, has turned into one of the main open questions in the study of kernelization. This work provides the first (randomized) polynomial kernelization for OCT. We introduce a novel kernelization approach based on matroid theory, where we encode all relevant information about a problem instance into a matroid with a representation of size polynomial in k. For OCT, the matroid is built to allow us to simulate the computation of the iterative compression step of the algorithm of Reed, Smith, and Vetta, applied (for only one round) to an approximate odd cycle transversal which it is aiming to shrink to size k. The process is randomized with one-sided error exponentially small in k, where the result can contain false positives but no false negatives, and the size guarantee is cubic in the size of the approximate solution. Combined with an O(√(log n))-approximation (Agarwal et al., STOC 2005), we get a reduction of the instance to size O(k^{4.5}), implying a randomized polynomial kernelization.
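
    To pin down the problem the kernelization targets (not the matroid machinery itself), the following Python sketch spells out the OCT predicate, i.e., whether deleting a vertex set leaves a bipartite graph, together with a brute-force solver for toy instances; the function names and the exhaustive search are illustrative only.

```python
from itertools import combinations

def is_bipartite_after_deletion(adj, removed=frozenset()):
    """Check 2-colorability of the graph obtained by deleting `removed`."""
    color = {}
    for s in adj:
        if s in removed or s in color:
            continue
        color[s] = 0
        stack = [s]
        while stack:
            v = stack.pop()
            for u in adj[v]:
                if u in removed:
                    continue
                if u not in color:
                    color[u] = 1 - color[v]
                    stack.append(u)
                elif color[u] == color[v]:
                    return False  # an odd cycle survives the deletion
    return True

def odd_cycle_transversal_bruteforce(adj, k):
    """Return a transversal of size at most k if one exists, else None.
    Exponential time; only meant to make the problem statement concrete."""
    vertices = list(adj)
    for size in range(k + 1):
        for S in combinations(vertices, size):
            if is_bipartite_after_deletion(adj, frozenset(S)):
                return set(S)
    return None

# A triangle becomes bipartite after deleting any single vertex.
triangle = {1: [2, 3], 2: [1, 3], 3: [1, 2]}
print(odd_cycle_transversal_bruteforce(triangle, 1))  # e.g. {1}
```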

    Counting Problems in Parameterized Complexity

    This survey is an invitation to parameterized counting problems for readers with a background in parameterized algorithms and complexity. After an introduction to the peculiarities of counting complexity, we survey the parameterized approach to counting problems, with a focus on two topics of recent interest: counting small patterns in large graphs, and counting perfect matchings and Hamiltonian cycles in well-structured graphs. While this survey presupposes familiarity with parameterized algorithms and complexity, we aim to explain all relevant notions from counting complexity in a self-contained way.

    Approximate Counting of k-Paths: Deterministic and in Polynomial Space

    A few years ago, Alon et al. [ISMB 2008] gave a simple randomized O((2e)^k m epsilon^{-2})-time exponential-space algorithm to approximately compute the number of paths on k vertices in a graph G up to a multiplicative error of 1 +/- epsilon. Shortly afterwards, Alon and Gutner [IWPEC 2009, TALG 2010] gave a deterministic exponential-space algorithm with running time (2e)^{k+O(log^3 k)} m log n whenever epsilon^{-1} = k^{O(1)}. Recently, Brand et al. [STOC 2018] provided a speed-up at the cost of reintroducing randomization. Specifically, they gave a randomized O(4^k m epsilon^{-2})-time exponential-space algorithm. In this article, we revisit the algorithm by Alon and Gutner. We modify the foundation of their work, and with a novel twist, obtain the following results. - We present a deterministic 4^{k+O(sqrt{k}(log^2 k + log^2 epsilon^{-1}))} m log n-time polynomial-space algorithm. This matches the running time of the best known deterministic polynomial-space algorithm for deciding whether a given graph G has a path on k vertices. - Additionally, we present a randomized 4^{k+O(log k(log k + log epsilon^{-1}))} m log n-time polynomial-space algorithm. While Brand et al. make non-trivial use of exterior algebra, our algorithm is very simple; we only make elementary use of the probabilistic method. Thus, the algorithm by Brand et al. runs in time 4^{k+o(k)} m whenever epsilon^{-1} = 2^{o(k)}, while our deterministic and randomized algorithms run in time 4^{k+o(k)} m log n whenever epsilon^{-1} = 2^{o(k^{1/4})} and epsilon^{-1} = 2^{o(k/(log k))}, respectively. Prior to our work, no 2^{O(k)} n^{O(1)}-time polynomial-space algorithm was known. Additionally, our approach is embeddable in the classic framework of divide-and-color, hence it immediately extends to approximate counting of graphs of bounded treewidth; in comparison, Brand et al. note that their approach is limited to graphs of bounded pathwidth.
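
    As context for the running times above, here is a rough Python sketch of the original color-coding estimator of Alon et al. that this line of work refines: a uniformly random k-coloring keeps a fixed k-path colorful with probability k!/k^k, colorful paths can be counted exactly by a subset dynamic program (which is why this baseline needs exponential space), and rescaling the average over trials gives the estimate. Function names and the trial count are illustrative, and this is not the derandomized or polynomial-space algorithm of the article.

```python
import random
from math import factorial

def count_colorful_k_paths(adj, color, k):
    """Exact number of k-vertex paths whose vertices get pairwise distinct
    colors: dp[v][S] = number of such paths ending at v using color set S."""
    dp = {v: {1 << color[v]: 1} for v in adj}
    for _ in range(k - 1):
        new = {v: {} for v in adj}
        for v in adj:
            for u in adj[v]:
                for S, cnt in dp[u].items():
                    if not (S >> color[v]) & 1:       # v's color unused so far
                        T = S | (1 << color[v])
                        new[v][T] = new[v].get(T, 0) + cnt
        dp = new
    # every undirected path is found once from each of its two endpoints
    return sum(cnt for v in adj for cnt in dp[v].values()) // 2

def estimate_k_paths(adj, k, trials=200):
    """Average the colorful counts over random colorings and rescale by
    the probability k!/k^k that a fixed k-path becomes colorful."""
    assert k >= 2
    p = factorial(k) / k ** k
    total = 0
    for _ in range(trials):
        color = {v: random.randrange(k) for v in adj}
        total += count_colorful_k_paths(adj, color, k)
    return total / (trials * p)
```

    A colorful walk cannot repeat a vertex, since a repeated vertex would repeat a color; this is why the dynamic program only needs to remember the set of used colors rather than the set of visited vertices.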

    Tight Algorithms for Connectivity Problems Parameterized by Clique-Width


    ETH-Tight Algorithms for Long Path and Cycle on Unit Disk Graphs

    We present an algorithm for the extensively studied Long Path and Long Cycle problems on unit disk graphs that runs in time 2^{O(√k)}(n+m). Under the Exponential Time Hypothesis, Long Path and Long Cycle on unit disk graphs cannot be solved in time 2^{o(√k)}(n+m)^{O(1)} [de Berg et al., STOC 2018], hence our algorithm is optimal. Besides the 2^{O(√k)}(n+m)^{O(1)}-time algorithm for the (arguably) much simpler Vertex Cover problem by de Berg et al. [STOC 2018] (which easily follows from the existence of a 2k-vertex kernel for the problem), this is the only known ETH-optimal fixed-parameter tractable algorithm on UDGs. Previously, Long Path and Long Cycle on unit disk graphs were only known to be solvable in time 2^{O(√k log k)}(n+m). That algorithm involved the introduction of a new type of tree decomposition, entailing the design of a very tedious dynamic programming procedure. Our algorithm is substantially simpler: we completely avoid the use of this new type of tree decomposition. Instead, we use a marking procedure to reduce the problem to (a weighted version of) itself on a standard tree decomposition of width O(√k).
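
    For readers less familiar with the setting, the Python sketch below (with illustrative names) only fixes the definitions: it builds a unit disk graph from points in the plane and checks for a path on k vertices by brute force; it is not the 2^{O(√k)}(n+m) algorithm or the marking procedure of the paper.

```python
from itertools import combinations
from math import dist

def unit_disk_graph(points):
    """Vertices are points in the plane; two points are adjacent exactly
    when their Euclidean distance is at most 1."""
    adj = {i: [] for i in range(len(points))}
    for i, j in combinations(range(len(points)), 2):
        if dist(points[i], points[j]) <= 1:
            adj[i].append(j)
            adj[j].append(i)
    return adj

def has_path_on_k_vertices(adj, k):
    """Exhaustive search for a simple path on k vertices (exponential;
    only to make the Long Path problem concrete)."""
    def extend(v, visited):
        if len(visited) == k:
            return True
        return any(extend(u, visited | {u}) for u in adj[v] if u not in visited)
    return any(extend(v, {v}) for v in adj)

points = [(0.0, 0.0), (0.8, 0.0), (1.6, 0.0), (5.0, 5.0)]  # last point is isolated
udg = unit_disk_graph(points)
print(has_path_on_k_vertices(udg, 3))  # True: 0 - 1 - 2
print(has_path_on_k_vertices(udg, 4))  # False
```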

    Twin-Width and Polynomial Kernels

    We study the existence of polynomial kernels for parameterized problems without a polynomial kernel on general graphs, when restricted to graphs of bounded twin-width. It was previously observed in [Bonnet et al., ICALP '21] that the problem k-Independent Set allows no polynomial kernel on graphs of bounded twin-width by a very simple argument, which extends to several other problems such as k-Independent Dominating Set, k-Path, k-Induced Path, and k-Induced Matching. In this work, we examine k-Dominating Set and variants of k-Vertex Cover for the existence of polynomial kernels. As a main result, we show that k-Dominating Set does not admit a polynomial kernel on graphs of twin-width at most 4 under a standard complexity-theoretic assumption. The reduction is intricate, especially due to the effort to bring the twin-width down to 4, and it can be tweaked to work for Connected k-Dominating Set and Total k-Dominating Set with a slightly worse bound on the twin-width. On the positive side, we obtain a simple quadratic vertex kernel for Connected k-Vertex Cover and Capacitated k-Vertex Cover on graphs of bounded twin-width. These kernels rely on the fact that graphs of bounded twin-width have Vapnik-Chervonenkis (VC) density 1, that is, for any vertex set X, the number of distinct neighborhoods in X is at most c·|X|, where c is a constant depending only on the twin-width. Interestingly, the kernel applies to any graph class of VC density 1, and does not require a witness sequence. We also present a more intricate O(k^{1.5}) vertex kernel for Connected k-Vertex Cover. Finally, we show that deciding if a graph has twin-width at most 1 can be done in polynomial time, and observe that most graph optimization/decision problems can be solved in polynomial time on graphs of twin-width at most 1.
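
    The quantity behind the quadratic kernel, the number of distinct neighborhood traces on a vertex set X, is easy to state in code; the following Python snippet (with illustrative names) just computes it, and VC density 1 means the returned value is at most c·|X| on the graph classes discussed above.

```python
def distinct_neighborhood_traces(adj, X):
    """Number of distinct sets N(v) ∩ X over all vertices v of the graph.
    On a class of VC density 1 this is at most c * |X| for a constant c."""
    X = set(X)
    return len({frozenset(X.intersection(adj[v])) for v in adj})

# In the star K_{1,4} with center 0 and X = {1, 2}, only two traces occur:
# the center sees {1, 2} and every leaf sees the empty set.
star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
print(distinct_neighborhood_traces(star, {1, 2}))  # 2
```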

    Grundy Distinguishes Treewidth from Pathwidth

    Structural graph parameters, such as treewidth, pathwidth, and clique-width, are a central topic of study in parameterized complexity. A main aim of research in this area is to understand the "price of generality" of these widths: as we transition from more restrictive to more general notions, which are the problems that see their complexity status deteriorate from fixed-parameter tractable to intractable? This type of question is by now very well-studied, but, somewhat strikingly, the algorithmic frontier between the two (arguably) most central width notions, treewidth and pathwidth, is still not understood: currently, no natural graph problem is known to be W-hard for one but FPT for the other. Indeed, a surprising development of the last few years has been the observation that for many of the most paradigmatic problems, their complexities for the two parameters actually coincide exactly, despite the fact that treewidth is a much more general parameter. It would thus appear that the extra generality of treewidth over pathwidth often comes "for free". Our main contribution in this paper is to uncover the first natural example where this generality comes with a high price. We consider Grundy Coloring, a variation of coloring where one seeks to calculate the worst possible coloring that could be assigned to a graph by a greedy First-Fit algorithm. We show that this well-studied problem is FPT parameterized by pathwidth; however, it becomes significantly harder (W[1]-hard) when parameterized by treewidth. Furthermore, we show that Grundy Coloring makes a second complexity jump for more general widths, as it becomes para-NP-hard for clique-width. Hence, Grundy Coloring nicely captures the complexity trade-offs between the three most well-studied parameters. Completing the picture, we show that Grundy Coloring is FPT parameterized by modular-width.
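
    Since the abstract defines Grundy Coloring through the greedy First-Fit procedure, here is a tiny Python sketch of that procedure and of the Grundy number as a maximum over vertex orderings; the brute-force enumeration is only for intuition on toy graphs and is unrelated to the paper's pathwidth and modular-width algorithms.

```python
from itertools import permutations

def first_fit_coloring(adj, order):
    """Color the vertices in the given order, always using the smallest
    positive color not already taken by a colored neighbor."""
    color = {}
    for v in order:
        used = {color[u] for u in adj[v] if u in color}
        c = 1
        while c in used:
            c += 1
        color[v] = c
    return color

def grundy_number_bruteforce(adj):
    """Grundy number: the largest number of colors First-Fit can be forced
    to use over all vertex orderings (exponential; toy graphs only)."""
    return max(max(first_fit_coloring(adj, order).values())
               for order in permutations(adj))

# The path a - b - c - d has Grundy number 3: the ordering a, d, b, c
# colors a and d with 1, forces 2 on b, and then 3 on c.
path = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(grundy_number_bruteforce(path))  # 3
```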