
    Parameterized Algorithms for List K-Cycle

    The classic K-Cycle problem asks if a graph $G$, with vertex set $V(G)$, has a simple cycle containing all vertices of a given set $K \subseteq V(G)$. In terms of colored graphs, it can be rephrased as follows: given a graph $G$, a set $K \subseteq V(G)$ and an injective coloring $c : K \to \{1,2,\ldots,|K|\}$, decide if $G$ has a simple cycle containing each color in $\{1,2,\ldots,|K|\}$ (once). Another problem, widely known since the introduction of color coding, is Colorful Cycle: given a graph $G$ and a coloring $c : V(G) \to \{1,2,\ldots,k\}$ for some natural number $k$, it asks if $G$ has a simple cycle of length $k$ containing each color in $\{1,2,\ldots,k\}$ (once). We study a generalization of these problems: given a graph $G$, a set $K \subseteq V(G)$, a list-coloring $L : K \to 2^{\{1,2,\ldots,k^*\}}$ for some natural number $k^*$ and a parameter $k$, List K-Cycle asks if one can assign a color to each vertex in $K$ so that $G$ has a simple cycle (of arbitrary length) containing exactly $k$ vertices from $K$ with distinct colors. We design a randomized algorithm for List K-Cycle running in time $2^k n^{O(1)}$ on an $n$-vertex graph, matching the best known running times of algorithms for both K-Cycle and Colorful Cycle. Moreover, unless the Set Cover Conjecture is false, our algorithm is essentially optimal. We also study a variant of List K-Cycle that generalizes the classic Hamiltonicity problem, where one specifies the size of a solution. Our results integrate three related algebraic approaches, introduced by Bjorklund, Husfeldt and Taslaman (SODA'12), Bjorklund, Kaski and Kowalik (STACS'13), and Bjorklund (FOCS'10).
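
    To make the List K-Cycle definition concrete, here is a minimal brute-force sketch (illustrative only, not the paper's $2^k n^{O(1)}$ algebraic algorithm; the adjacency-dict encoding and the function names are assumptions). It enumerates simple cycles of a tiny undirected graph and checks whether some cycle contains exactly $k$ vertices of $K$ whose lists admit pairwise distinct colors.

```python
from itertools import permutations

def distinct_colors(special, L, used=frozenset()):
    # Backtracking: give each special vertex a color from its list L[v],
    # with all chosen colors pairwise distinct.
    if not special:
        return True
    v, rest = special[0], special[1:]
    return any(distinct_colors(rest, L, used | {c}) for c in L[v] - used)

def has_list_k_cycle(adj, K, L, k):
    """Brute-force List K-Cycle check on a tiny undirected graph.
    adj: dict vertex -> set of neighbors; K: the special vertices;
    L: dict v -> set of allowed colors; k: required number of
    distinctly colored K-vertices on the cycle.  Exponential time."""
    vertices = list(adj)
    for length in range(3, len(vertices) + 1):
        for cyc in permutations(vertices, length):
            # Consecutive vertices (including the wrap-around) must be adjacent.
            if any(cyc[(i + 1) % length] not in adj[cyc[i]] for i in range(length)):
                continue
            special = [v for v in cyc if v in K]
            if len(special) == k and distinct_colors(special, L):
                return True
    return False

# Triangle a-b-c where a and b must receive distinct colors from their lists.
adj = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b"}}
print(has_list_k_cycle(adj, K={"a", "b"}, L={"a": {1}, "b": {1, 2}}, k=2))  # True
```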

    Lossy Kernelization

    In this paper we propose a new framework for analyzing the performance of preprocessing algorithms. Our framework builds on the notion of kernelization from parameterized complexity. However, as opposed to the original notion of kernelization, our definitions combine well with approximation algorithms and heuristics. The key new definition is that of a polynomial size $\alpha$-approximate kernel. Loosely speaking, a polynomial size $\alpha$-approximate kernel is a polynomial time pre-processing algorithm that takes as input an instance $(I,k)$ of a parameterized problem, and outputs another instance $(I',k')$ of the same problem, such that $|I'|+k' \leq k^{O(1)}$. Additionally, for every $c \geq 1$, a $c$-approximate solution $s'$ to the pre-processed instance $(I',k')$ can be turned in polynomial time into a $(c \cdot \alpha)$-approximate solution $s$ to the original instance $(I,k)$. Our main technical contributions are $\alpha$-approximate kernels of polynomial size for three problems, namely Connected Vertex Cover, Disjoint Cycle Packing and Disjoint Factors. These problems are known not to admit any polynomial size kernels unless $NP \subseteq coNP/poly$. Our approximate kernels simultaneously beat both the lower bounds on the (normal) kernel size and the hardness of approximation lower bounds for all three problems. On the negative side, we prove that Longest Path parameterized by the length of the path and Set Cover parameterized by the universe size do not admit even an $\alpha$-approximate kernel of polynomial size, for any $\alpha \geq 1$, unless $NP \subseteq coNP/poly$. In order to prove this lower bound we need to combine in a non-trivial way the techniques used for showing kernelization lower bounds with the methods for showing hardness of approximation.
    Comment: 58 pages. Version 2 contains new results: PSAKS for Cycle Packing and approximate kernel lower bounds for Set Cover and Hitting Set parameterized by universe size.
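
    Schematically, the guarantee described above has two parts (a sketch only; the paper's formal definition, given via a reduction algorithm and a solution-lifting algorithm, is more detailed):

```latex
\[
  (I,k) \;\xrightarrow{\ \text{poly-time reduction}\ }\; (I',k')
  \qquad\text{with}\qquad |I'| + k' \le k^{O(1)},
\]
\[
  s' \text{ a $c$-approximate solution to } (I',k')
  \;\xrightarrow{\ \text{poly-time lifting}\ }\;
  s \text{ a $(c \cdot \alpha)$-approximate solution to } (I,k),
  \qquad\text{for every } c \ge 1.
\]
```

    In particular, running any $c$-approximation algorithm on the kernel and lifting its output yields a $(c \cdot \alpha)$-approximation for the original instance.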

    Complexity of the Steiner Network Problem with Respect to the Number of Terminals

    In the Directed Steiner Network problem we are given an arc-weighted digraph $G$, a set of terminals $T \subseteq V(G)$, and an (unweighted) directed request graph $R$ with $V(R)=T$. Our task is to output a subgraph $G' \subseteq G$ of minimum cost such that there is a directed path from $s$ to $t$ in $G'$ for all $st \in A(R)$. It is known that the problem can be solved in time $|V(G)|^{O(|A(R)|)}$ [Feldman & Ruhl, SIAM J. Comput. 2006] and cannot be solved in time $|V(G)|^{o(|A(R)|)}$ even if $G$ is planar, unless the Exponential-Time Hypothesis (ETH) fails [Chitnis et al., SODA 2014]. However, as this reduction (and other reductions showing hardness of the problem) only shows that the problem cannot be solved in time $|V(G)|^{o(|T|)}$ unless ETH fails, there is a significant gap in the complexity with respect to $|T|$ in the exponent. We show that Directed Steiner Network is solvable in time $f(R)\cdot |V(G)|^{O(c_g \cdot |T|)}$, where $c_g$ is a constant depending solely on the genus of $G$ and $f$ is a computable function. We complement this result by showing that there is no $f(R)\cdot |V(G)|^{o(|T|^2/ \log |T|)}$ algorithm for any function $f$ for the problem on general graphs, unless ETH fails.
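
    The optimization version asks for a minimum-cost such subgraph; the sketch below (the edge-list encoding and the function name are assumptions) only checks the feasibility side of the definition, i.e., that a candidate $G'$ contains a directed $s$-$t$ path for every request, using one BFS per request. It is not the $f(R)\cdot|V(G)|^{O(c_g \cdot |T|)}$ algorithm.

```python
from collections import defaultdict, deque

def satisfies_requests(subgraph_arcs, requests):
    """Check that a candidate solution G' (a list of directed arcs (u, v))
    contains a directed s -> t path for every request (s, t) in A(R)."""
    adj = defaultdict(list)
    for u, v in subgraph_arcs:
        adj[u].append(v)

    def reaches(s, t):
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            if u == t:
                return True
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        return False

    return all(reaches(s, t) for s, t in requests)
```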

    Parameterized Algorithms on Perfect Graphs for deletion to $(r,\ell)$-graphs

    For fixed integers $r, \ell \geq 0$, a graph $G$ is called an $(r,\ell)$-graph if the vertex set $V(G)$ can be partitioned into $r$ independent sets and $\ell$ cliques. The class of $(r,\ell)$-graphs generalizes $r$-colourable graphs (when $\ell = 0$) and hence, not surprisingly, determining whether a given graph is an $(r,\ell)$-graph is NP-hard even when $r \geq 3$ or $\ell \geq 3$ in general graphs. When $r$ and $\ell$ are part of the input, the recognition problem is NP-hard even if the input graph is a perfect graph (where the Chromatic Number problem is solvable in polynomial time). It is also known to be fixed-parameter tractable (FPT) on perfect graphs when parameterized by $r$ and $\ell$, i.e., there is an $f(r+\ell) \cdot n^{O(1)}$ algorithm on perfect graphs on $n$ vertices, where $f$ is some (exponential) function of $r$ and $\ell$. In this paper, we consider the parameterized complexity of the following problem, which we call Vertex Partization: given a perfect graph $G$ and positive integers $r, \ell, k$, decide whether there exists a set $S \subseteq V(G)$ of size at most $k$ such that the deletion of $S$ from $G$ results in an $(r,\ell)$-graph. We obtain the following results: (1) Vertex Partization on perfect graphs is FPT when parameterized by $k+r+\ell$. (2) The problem does not admit any polynomial sized kernel when parameterized by $k+r+\ell$; in other words, in polynomial time, the input graph cannot be compressed to an equivalent instance of size polynomial in $k+r+\ell$. In fact, our result holds even when $k=0$. (3) When $r$ and $\ell$ are universal constants, Vertex Partization on perfect graphs, parameterized by $k$, has a polynomial sized kernel.
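
    A naive exponential-time check of the $(r,\ell)$-graph definition itself (illustrative only and unrelated to the paper's FPT and kernelization results; the adjacency-dict encoding is an assumption) could look as follows; Vertex Partization then asks whether deleting at most $k$ vertices makes this test succeed.

```python
from itertools import product

def is_r_ell_graph(adj, r, ell):
    """Brute-force test of the (r, ell)-graph definition: can V(G) be split
    into r independent sets and ell cliques?  adj: dict vertex -> set of
    neighbors.  Exponential in |V(G)|, intended for tiny examples only."""
    vertices = list(adj)
    for labels in product(range(r + ell), repeat=len(vertices)):
        label = dict(zip(vertices, labels))
        ok = True
        for i, u in enumerate(vertices):
            for v in vertices[i + 1:]:
                if label[u] != label[v]:
                    continue
                if label[u] < r and v in adj[u]:          # independent-set part: edge forbidden
                    ok = False
                elif label[u] >= r and v not in adj[u]:   # clique part: edge required
                    ok = False
        if ok:
            return True
    return False
```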

    Refined Complexity of PCA with Outliers

    Principal component analysis (PCA) is one of the most fundamental procedures in exploratory data analysis and is the basic step in applications ranging from quantitative finance and bioinformatics to image analysis and neuroscience. However, it is well-documented that the applicability of PCA in many real scenarios could be constrained by an "immune deficiency" to outliers such as corrupted observations. We consider the following algorithmic question about PCA with outliers. For a set of $n$ points in $\mathbb{R}^{d}$, how can one learn a subset of points, say 1% of the total number of points, such that the remaining part of the points is best fit into some unknown $r$-dimensional subspace? We provide a rigorous algorithmic analysis of the problem. We show that the problem is solvable in time $n^{O(d^2)}$. In particular, for constant dimension the problem is solvable in polynomial time. We complement the algorithmic result by a lower bound, showing that unless the Exponential Time Hypothesis fails, in time $f(d)n^{o(d)}$, for any function $f$ of $d$, it is impossible not only to solve the problem exactly but even to approximate it within a constant factor.
    Comment: To be presented at ICML 2019.
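
    To clarify the objective only (this is not the $n^{O(d^2)}$ algorithm), the sketch below scores one candidate outlier set as the squared distance of the remaining points to their best-fit $r$-dimensional subspace, computed via SVD. Treating the subspace as linear (through the origin) and the function name are assumptions.

```python
import numpy as np

def residual_cost(points, outliers, r):
    """Squared distance of the non-outlier rows of `points` (an n x d array)
    to their best-fit r-dimensional linear subspace."""
    mask = np.ones(len(points), dtype=bool)
    mask[list(outliers)] = False
    X = points[mask]
    # The best-fit r-dimensional subspace is spanned by the top r right
    # singular vectors; the residual is the energy in the remaining ones.
    s = np.linalg.svd(X, compute_uv=False)
    return float((s[r:] ** 2).sum())
```

    PCA with outliers then asks for the outlier set (of a prescribed size) minimizing this residual.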

    Covering Small Independent Sets and Separators with Applications to Parameterized Algorithms

    We present two new combinatorial tools for the design of parameterized algorithms. The first is a simple linear time randomized algorithm that, given as input a $d$-degenerate graph $G$ and an integer $k$, outputs an independent set $Y$ such that for every independent set $X$ in $G$ of size at most $k$, the probability that $X$ is a subset of $Y$ is at least $\left({(d+1)k \choose k} \cdot k(d+1)\right)^{-1}$. The second is a new (deterministic) polynomial time graph sparsification procedure that, given a graph $G$, a set $T = \{\{s_1, t_1\}, \{s_2, t_2\}, \ldots, \{s_\ell, t_\ell\}\}$ of terminal pairs and an integer $k$, returns an induced subgraph $G^\star$ of $G$ that maintains all the inclusion minimal multicuts of $G$ of size at most $k$, and does not contain any $(k+2)$-vertex connected set of size $2^{O(k)}$. In particular, $G^\star$ excludes a clique of size $2^{O(k)}$ as a topological minor. Put together, our new tools yield new randomized fixed parameter tractable (FPT) algorithms for Stable $s$-$t$ Separator, Stable Odd Cycle Transversal and Stable Multicut on general graphs, and for Stable Directed Feedback Vertex Set on $d$-degenerate graphs, resolving two problems left open by Marx et al. [ACM Transactions on Algorithms, 2013]. All of our algorithms can be derandomized at the cost of a small overhead in the running time.
    Comment: 35 pages.
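
    As background for the $d$-degenerate graphs the first tool operates on (this is the standard minimum-degree peeling, not the paper's randomized covering algorithm; the adjacency-dict encoding is an assumption), the degeneracy and a degeneracy ordering can be computed as follows.

```python
def degeneracy_ordering(adj):
    """Return (ordering, d): repeatedly remove a vertex of minimum degree in
    the remaining graph; d, the largest degree seen at removal time, is the
    degeneracy.  adj: dict vertex -> set of neighbors (undirected)."""
    remaining = {v: set(ns) for v, ns in adj.items()}
    order, d = [], 0
    while remaining:
        v = min(remaining, key=lambda u: len(remaining[u]))
        d = max(d, len(remaining[v]))
        order.append(v)
        for u in remaining[v]:
            remaining[u].discard(v)
        del remaining[v]
    return order, d
```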