
    Exploring Subexponential Parameterized Complexity of Completion Problems

    Let $\mathcal{F}$ be a family of graphs. In the $\mathcal{F}$-Completion problem, we are given a graph $G$ and an integer $k$ as input, and asked whether at most $k$ edges can be added to $G$ so that the resulting graph does not contain a graph from $\mathcal{F}$ as an induced subgraph. It appeared recently that special cases of $\mathcal{F}$-Completion, the problem of completing into a chordal graph known as Minimum Fill-in, corresponding to the case of $\mathcal{F}=\{C_4,C_5,C_6,\ldots\}$, and the problem of completing into a split graph, i.e., the case of $\mathcal{F}=\{C_4, 2K_2, C_5\}$, are solvable in parameterized subexponential time $2^{O(\sqrt{k}\log k)}n^{O(1)}$. The exploration of this phenomenon is the main motivation for our research on $\mathcal{F}$-Completion. In this paper we prove that completions into several well-studied classes of graphs without long induced cycles also admit parameterized subexponential time algorithms by showing that:
    - The problem Trivially Perfect Completion is solvable in parameterized subexponential time $2^{O(\sqrt{k}\log k)}n^{O(1)}$, that is, $\mathcal{F}$-Completion for $\mathcal{F}=\{C_4, P_4\}$, a cycle and a path on four vertices.
    - The problems known in the literature as Pseudosplit Completion, the case where $\mathcal{F}=\{2K_2, C_4\}$, and Threshold Completion, where $\mathcal{F}=\{2K_2, P_4, C_4\}$, are also solvable in time $2^{O(\sqrt{k}\log k)}n^{O(1)}$.
    We complement our algorithms for $\mathcal{F}$-Completion with the following lower bounds:
    - For $\mathcal{F}=\{2K_2\}$, $\mathcal{F}=\{C_4\}$, $\mathcal{F}=\{P_4\}$, and $\mathcal{F}=\{2K_2, P_4\}$, $\mathcal{F}$-Completion cannot be solved in time $2^{o(k)}n^{O(1)}$ unless the Exponential Time Hypothesis (ETH) fails.
    Our upper and lower bounds provide a complete picture of the subexponential parameterized complexity of $\mathcal{F}$-Completion problems for $\mathcal{F}\subseteq\{2K_2, C_4, P_4\}$.
    Comment: 32 pages, 16 figures. A preliminary version of this paper appeared in the proceedings of STACS'1
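    The following is a minimal Python sketch (ours, not from the paper) that makes the Trivially Perfect Completion special case concrete: it detects induced $C_4$/$P_4$ obstructions by brute force over 4-vertex subsets and searches over all edge additions of size at most $k$. The adjacency-dictionary representation and the function names are assumptions for illustration; the routine is exponential and only illustrates the problem statement, not the paper's $2^{O(\sqrt{k}\log k)}n^{O(1)}$ algorithm.

```python
from itertools import combinations

def induced_degrees(adj, quad):
    """Degree of each vertex of `quad` inside the subgraph induced by `quad`."""
    return sorted(sum(1 for u in quad if u != v and u in adj[v]) for v in quad)

def has_induced_c4_or_p4(adj):
    """True iff the graph (dict: vertex -> set of neighbours) contains an
    induced C4 or P4, i.e. iff it is not trivially perfect."""
    for quad in combinations(adj, 4):
        degs = induced_degrees(adj, quad)
        # On four vertices, degree sequence [2,2,2,2] is exactly C4 and
        # [1,1,2,2] is exactly P4 (a claw gives [1,1,1,3], a triangle
        # plus an isolated vertex gives [0,2,2,2]).
        if degs == [2, 2, 2, 2] or degs == [1, 1, 2, 2]:
            return True
    return False

def trivially_perfect_completion(adj, k):
    """Naive F-Completion check for F = {C4, P4}: try every set of at most
    k non-edges to add.  Exponential in the number of non-edges; tiny graphs only."""
    non_edges = [e for e in combinations(adj, 2) if e[1] not in adj[e[0]]]
    for size in range(k + 1):
        for added in combinations(non_edges, size):
            new_adj = {v: set(nbrs) for v, nbrs in adj.items()}
            for u, v in added:
                new_adj[u].add(v)
                new_adj[v].add(u)
            if not has_induced_c4_or_p4(new_adj):
                return True
    return False

# The path 1-2-3-4 is an induced P4; adding one edge completes it into a
# trivially perfect graph.
p4 = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
assert has_induced_c4_or_p4(p4)
assert trivially_perfect_completion(p4, 1)
```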

    Fast Biclustering by Dual Parameterization

    We study two clustering problems, Starforest Editing, the problem of adding and deleting edges to obtain a disjoint union of stars, and the generalization Bicluster Editing. We show that, in addition to being NP-hard, none of the problems can be solved in subexponential time unless the exponential time hypothesis fails. Misra, Panolan, and Saurabh (MFCS 2013) argue that introducing a bound on the number of connected components in the solution should not make the problem easier: in particular, they argue that the subexponential time algorithm for editing to a fixed number of clusters (p-Cluster Editing) by Fomin et al. (J. Comput. Syst. Sci., 80(7) 2014) is an exception rather than the rule. Here, $p$ is a secondary parameter, bounding the number of components in the solution. However, upon bounding the number of stars or bicliques in the solution, we obtain algorithms which run in time $2^{5\sqrt{pk}} + O(n+m)$ for p-Starforest Editing and $2^{O(p\sqrt{k}\log(pk))} + O(n+m)$ for p-Bicluster Editing. We obtain a similar result for the more general case of t-Partite p-Cluster Editing. This is subexponential in $k$ for a fixed number of clusters, since $p$ is then considered a constant. Our results even out the number of multivariate subexponential time algorithms and give reasons to believe that this area warrants further study.
    Comment: Accepted for presentation at IPEC 201
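    As a small illustration (our own sketch, not part of the paper), the target class of Starforest Editing can be recognised directly: a graph is a star forest exactly when every connected component is a tree whose edges all meet one centre vertex. The adjacency-dictionary input format is an assumption made for the example.

```python
from collections import deque

def is_star_forest(adj):
    """True iff every connected component of the graph is a star
    (a centre joined to any number of leaves; K1 and K2 count as stars).
    adj: dict mapping vertex -> set of neighbours."""
    seen = set()
    for root in adj:
        if root in seen:
            continue
        comp, queue = {root}, deque([root])   # collect the component by BFS
        while queue:
            v = queue.popleft()
            for u in adj[v]:
                if u not in comp:
                    comp.add(u)
                    queue.append(u)
        seen |= comp
        edges = sum(len(adj[v]) for v in comp) // 2
        max_deg = max(len(adj[v]) for v in comp)
        # A star on c vertices has exactly c - 1 edges, all meeting one centre.
        if edges != len(comp) - 1 or (len(comp) > 1 and max_deg != len(comp) - 1):
            return False
    return True

# A path on four vertices is a tree but not a star, so it still needs editing.
print(is_star_forest({1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}))  # False
print(is_star_forest({1: {2, 3}, 2: {1}, 3: {1}, 4: set()}))   # True
```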

    On the Threshold of Intractability

    We study the computational complexity of the graph modification problems Threshold Editing and Chain Editing, adding and deleting as few edges as possible to transform the input into a threshold (or chain) graph. In this article, we show that both problems are NP-complete, resolving a conjecture by Natanzon, Shamir, and Sharan (Discrete Applied Mathematics, 113(1):109--128, 2001). On the positive side, we show the problem admits a quadratic vertex kernel. Furthermore, we give a subexponential time parameterized algorithm solving Threshold Editing in $2^{O(\sqrt{k}\log k)} + \mathrm{poly}(n)$ time, making it one of relatively few natural problems in this complexity class on general graphs. These results are of broader interest to the field of social network analysis, where recent work of Brandes (ISAAC, 2014) posits that the minimum edit distance to a threshold graph gives a good measure of consistency for node centralities. Finally, we show that all our positive results extend to the related problem of Chain Editing, as well as the completion and deletion variants of both problems.
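    To make the target class concrete, threshold graphs can be recognised via the classical dismantling characterisation: repeatedly remove a vertex that is currently isolated or dominating, and the graph is threshold exactly when this empties it. The sketch below is our own illustration under an assumed adjacency-dictionary input; it is not the editing algorithm from the paper.

```python
def is_threshold(adj):
    """True iff the graph is a threshold graph.
    Uses the characterisation that threshold graphs are exactly the graphs that
    can be dismantled by repeatedly deleting an isolated or dominating vertex.
    adj: dict mapping vertex -> set of neighbours."""
    live = {v: set(nbrs) for v, nbrs in adj.items()}
    while live:
        n = len(live)
        # A vertex is removable if it is isolated or adjacent to all other live vertices.
        pick = next((v for v, nbrs in live.items()
                     if len(nbrs) == 0 or len(nbrs) == n - 1), None)
        if pick is None:
            return False
        del live[pick]
        for nbrs in live.values():
            nbrs.discard(pick)
    return True

# The path 1-2-3-4 contains an induced P4, so it is not a threshold graph.
print(is_threshold({1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}))        # False
# Built by adding 3, then 2 (dominating), then 4 (isolated), then 1 (dominating).
print(is_threshold({1: {2, 3, 4}, 2: {1, 3}, 3: {1, 2}, 4: {1}}))  # True
```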

    Polynomial kernelization for removing induced claws and diamonds

    A graph is called (claw,diamond)-free if it contains neither a claw (a $K_{1,3}$) nor a diamond (a $K_4$ with an edge removed) as an induced subgraph. Equivalently, (claw,diamond)-free graphs can be characterized as line graphs of triangle-free graphs, or as linear dominoes, i.e., graphs in which every vertex is in at most two maximal cliques and every edge is in exactly one maximal clique. In this paper we consider the parameterized complexity of the (claw,diamond)-free Edge Deletion problem, where given a graph $G$ and a parameter $k$, the question is whether one can remove at most $k$ edges from $G$ to obtain a (claw,diamond)-free graph. Our main result is that this problem admits a polynomial kernel. We complement this finding by proving that, even on instances with maximum degree $6$, the problem is NP-complete and cannot be solved in time $2^{o(k)}\cdot |V(G)|^{O(1)}$ unless the Exponential Time Hypothesis fails.
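    The forbidden patterns are small enough that membership in the target class can be tested by brute force over 4-vertex subsets; the sketch below (our own illustration, assuming an adjacency-dictionary input) relies on the fact that on four vertices the degree sequence [1,1,1,3] occurs only for the claw and [2,2,3,3] only for the diamond.

```python
from itertools import combinations

def is_claw_diamond_free(adj):
    """True iff the graph has no induced claw (K_{1,3}) and no induced
    diamond (K_4 minus an edge).  Brute force over all 4-vertex subsets,
    intended only to illustrate the forbidden patterns on small graphs.
    adj: dict mapping vertex -> set of neighbours."""
    for quad in combinations(adj, 4):
        degs = sorted(sum(1 for u in quad if u != v and u in adj[v]) for v in quad)
        if degs == [1, 1, 1, 3] or degs == [2, 2, 3, 3]:
            return False
    return True

# A diamond: triangle 1-2-3 plus a vertex 4 adjacent to 2 and 3.
diamond = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {2, 3}}
print(is_claw_diamond_free(diamond))                      # False
print(is_claw_diamond_free({1: {2}, 2: {1, 3}, 3: {2}}))  # True (a P3)
```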

    Exploiting Dense Structures in Parameterized Complexity

    Over the past few decades, the study of dense structures from the perspective of approximation algorithms has become a wide area of research. However, from the viewpoint of parameterized algorithms, this area is largely unexplored. In particular, properties of random samples have been successfully deployed to design approximation schemes for a number of fundamental problems on dense structures [Arora et al. FOCS 1995, Goldreich et al. FOCS 1996, Giotis and Guruswami SODA 2006, Karpinski and Schudy STOC 2009]. In this paper, we fill this gap, and harness the power of random samples as well as structure theory to design kernelization as well as parameterized algorithms on dense structures. In particular, we obtain linear vertex kernels for Edge-Disjoint Paths, Edge Odd Cycle Transversal, Minimum Bisection, d-Way Cut, Multiway Cut and Multicut on everywhere dense graphs. In fact, these kernels are obtained by designing a polynomial-time algorithm when the corresponding parameter is at most $\mathcal{O}(n)$. Additionally, we obtain a cubic kernel for Vertex-Disjoint Paths on everywhere dense graphs. In addition to kernelization results, we obtain randomized subexponential-time parameterized algorithms for Edge Odd Cycle Transversal, Minimum Bisection, and d-Way Cut. Finally, we show how all of our results (as well as EPASes for these problems) can be de-randomized.
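    The "everywhere dense" promise underlying these kernels simply requires every vertex to have degree at least a constant fraction of $n$. Below is a minimal sketch of that promise check, assuming an adjacency-dictionary input and a hypothetical density constant `eps` (the constant itself is problem-dependent and not taken from the paper).

```python
def is_everywhere_dense(adj, eps):
    """True iff every vertex has degree at least eps * n, i.e. the graph
    satisfies the 'everywhere dense' promise for density constant eps.
    adj: dict mapping vertex -> set of neighbours; eps is a constant in (0, 1]."""
    n = len(adj)
    return all(len(nbrs) >= eps * n for nbrs in adj.values())

# A 5-cycle has minimum degree 2 = 0.4 * 5, so it is everywhere 0.4-dense but not 0.5-dense.
c5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print(is_everywhere_dense(c5, 0.4))  # True
print(is_everywhere_dense(c5, 0.5))  # False
```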
