
    Cluster Editing: Kernelization based on Edge Cuts

    Kernelization algorithms for the {\sc cluster editing} problem have been a popular topic in recent research in parameterized computation. Thus far, most kernelization algorithms for this problem are based on the concept of {\it critical cliques}. In this paper, we present new observations and new techniques for the study of kernelization algorithms for the {\sc cluster editing} problem. Our techniques are based on the study of the relationship between {\sc cluster editing} and graph edge-cuts. As an application, we present an ${\cal O}(n^2)$-time algorithm that constructs a $2k$ kernel for the {\it weighted} version of the {\sc cluster editing} problem. Our result matches the best kernel size for the unweighted version of the {\sc cluster editing} problem, and significantly improves the previous best kernel of quadratic size for the weighted version of the problem.
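
    To make the problem the abstract refers to concrete: {\sc cluster editing} asks whether at most k edge insertions and deletions turn a graph into a disjoint union of cliques. The sketch below only checks that target property and counts a proposed set of edits; it is an illustrative aid under my own naming, not the paper's kernelization.

```python
# Minimal sketch (not the paper's algorithm): verify that a set of edge edits
# turns G into a cluster graph, i.e. a disjoint union of cliques.
from itertools import combinations

def is_cluster_graph(adj):
    """True iff every connected component of adj (dict: vertex -> set) is a clique."""
    seen = set()
    for start in adj:
        if start in seen:
            continue
        comp, stack = set(), [start]          # depth-first search for the component
        while stack:
            v = stack.pop()
            if v not in comp:
                comp.add(v)
                stack.extend(adj[v] - comp)
        seen |= comp
        # a component is a clique iff every pair inside it is adjacent
        if any(u not in adj[v] for u, v in combinations(comp, 2)):
            return False
    return True

def apply_edits(adj, edits):
    """Apply edits (frozensets {u, v}): delete existing edges, insert missing ones."""
    new = {v: set(nbrs) for v, nbrs in adj.items()}
    for e in edits:
        u, v = tuple(e)
        if v in new[u]:
            new[u].remove(v); new[v].remove(u)
        else:
            new[u].add(v); new[v].add(u)
    return new

# Example: the path a-b-c becomes a cluster graph after one deletion.
G = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}
H = apply_edits(G, [frozenset({'b', 'c'})])
print(is_cluster_graph(H))   # True, using 1 edit
```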

    (Sub)linear Kernels for Edge Modification Problems Towards Structured Graph Classes

    In a (parameterized) graph edge modification problem, we are given a graph G, an integer k and a (usually well-structured) class of graphs 𝒢, and ask whether it is possible to transform G into a graph G' ∈ 𝒢 by adding and/or removing at most k edges. Parameterized graph edge modification problems have received considerable attention in the last decades. In this paper, we focus on finding small kernels for edge modification problems. One of the most studied problems is the Cluster Editing problem, in which the goal is to partition the vertex set into a disjoint union of cliques. Even though this problem admits a 2k kernel [Cao and Chen, 2012], this kernel does not reduce the size of most instances. Therefore, we explore the question of whether linear kernels are a theoretical limit in edge modification problems, in particular when the target graphs are very structured (such as a partition into cliques, for instance). We prove what is, as far as we know, the first sublinear kernel for an edge modification problem: namely, we show that Clique + Independent Set Deletion, which is a restriction of Cluster Deletion, admits a kernel of size O(k/log k). We also obtain small kernels for several other edge modification problems. We prove that Split Addition (and the equivalent Split Deletion) admits a linear kernel, improving the existing quadratic kernel of Ghosh et al. [Ghosh et al., 2015]. We complement this result by proving that Trivially Perfect Addition admits a quadratic kernel (improving the cubic kernel of Guo [Guo, 2007]), and finally prove that its triangle-free version (Starforest Deletion) admits a linear kernel, which is optimal under ETH.
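
    The generic problem statement in the first sentence can be pinned down with a brute-force search over edit sets of size at most k. This is a hedged sketch of the definition only (exponential in k and in the number of vertex pairs), not of the paper's kernels; the target-class test used in the example is a hypothetical placeholder of my own.

```python
# Brute-force illustration of a parameterized edge modification problem:
# given G, k, and a membership test for the target class, search for a set of
# at most k edge additions/deletions whose application lands in the class.
from itertools import combinations

def edge_modification(adj, k, in_target_class):
    pairs = [frozenset(p) for p in combinations(sorted(adj), 2)]
    for size in range(k + 1):
        for edits in combinations(pairs, size):
            new = {v: set(n) for v, n in adj.items()}
            for e in edits:
                u, v = tuple(e)
                if v in new[u]:
                    new[u].discard(v); new[v].discard(u)   # delete existing edge
                else:
                    new[u].add(v); new[v].add(u)           # insert missing edge
            if in_target_class(new):
                return edits            # a witness using at most k edits
    return None

# Hypothetical target test for the example: the non-isolated vertices form a clique.
def clique_plus_isolated(adj):
    core = {v for v, n in adj.items() if n}
    return all(u in adj[v] for u, v in combinations(core, 2))

G = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}
print(edge_modification(G, 1, clique_plus_isolated))   # a witness with a single deleted edge
```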

    Fast Biclustering by Dual Parameterization

    We study two clustering problems, Starforest Editing, the problem of adding and deleting edges to obtain a disjoint union of stars, and the generalization Bicluster Editing. We show that, in addition to being NP-hard, neither problem can be solved in subexponential time unless the exponential time hypothesis fails. Misra, Panolan, and Saurabh (MFCS 2013) argue that introducing a bound on the number of connected components in the solution should not make the problem easier: in particular, they argue that the subexponential time algorithm for editing to a fixed number of clusters (p-Cluster Editing) by Fomin et al. (J. Comput. Syst. Sci., 80(7) 2014) is an exception rather than the rule. Here, p is a secondary parameter, bounding the number of components in the solution. However, upon bounding the number of stars or bicliques in the solution, we obtain algorithms which run in time $2^{5\sqrt{pk}} + O(n+m)$ for p-Starforest Editing and $2^{O(p\sqrt{k}\log(pk))} + O(n+m)$ for p-Bicluster Editing. We obtain a similar result for the more general case of t-Partite p-Cluster Editing. This is subexponential in k for a fixed number of clusters, since p is then considered a constant. Our results even out the number of multivariate subexponential time algorithms and give reasons to believe that this area warrants further study.
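
    The target of Starforest Editing, and the secondary parameter p, can be stated concretely: a graph is a starforest when every connected component is a star, and the dual parameterization additionally bounds the number of stars by p. The sketch below only tests that property; it says nothing about the $2^{5\sqrt{pk}} + O(n+m)$ algorithm itself, and the function names are mine.

```python
# Sketch of the Starforest Editing target: every component must be a star
# (a tree whose centre is adjacent to all other vertices of the component),
# and optionally there may be at most p components (the dual parameter).
def components(adj):
    seen = set()
    for start in adj:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            v = stack.pop()
            if v not in comp:
                comp.add(v)
                stack.extend(adj[v] - comp)
        seen |= comp
        yield comp

def is_starforest(adj, p=None):
    comps = list(components(adj))
    if p is not None and len(comps) > p:     # more than p stars: reject
        return False
    for comp in comps:
        edges = sum(len(adj[v] & comp) for v in comp) // 2
        max_deg = max(len(adj[v] & comp) for v in comp)
        # a star on |comp| vertices has |comp|-1 edges and a centre of degree |comp|-1
        if edges != len(comp) - 1 or max_deg != len(comp) - 1:
            return False
    return True

print(is_starforest({'a': {'b', 'c'}, 'b': {'a'}, 'c': {'a'}, 'd': set()}, p=2))  # True
```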

    Streaming Kernelization

    Kernelization is a formalization of preprocessing for combinatorially hard problems. We modify the standard definition of kernelization, which allows any polynomial-time algorithm for the preprocessing, by requiring instead that the preprocessing runs in a streaming setting and uses $\mathcal{O}(\mathrm{poly}(k)\log|x|)$ bits of memory on instances $(x,k)$. We obtain several results in this new setting, depending on the number of passes over the input that such a streaming kernelization is allowed to make. Edge Dominating Set turns out to be an interesting example because it has no single-pass kernelization, but two passes over the input suffice to match the bounds of the best standard kernelization.
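
    As a toy illustration of the memory-restricted model described above (and emphatically not the paper's Edge Dominating Set result), one pass over an edge stream can maintain a matching of at most k+1 edges in $\mathcal{O}(k\log|x|)$ bits; any matching of size k+1 already certifies that a Vertex Cover instance (G, k) is a no-instance.

```python
# One pass over an edge stream, storing at most k+1 matching edges.
# If the greedy matching exceeds k edges, every vertex cover of G has size > k,
# so a Vertex Cover instance (G, k) can be rejected after this single pass.
def one_pass_matching_filter(edge_stream, k):
    matched = set()      # endpoints of the stored matching
    matching = []        # at most k+1 edges kept in memory
    for u, v in edge_stream:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched |= {u, v}
            if len(matching) > k:
                return None          # reject: no vertex cover of size <= k exists
    return matching                  # small certificate retained from the stream

stream = [(1, 2), (2, 3), (4, 5), (5, 6), (7, 8)]
print(one_pass_matching_filter(iter(stream), k=2))   # None: 3 disjoint edges found
```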

    Kernelization and Enumeration: New Approaches to Solving Hard Problems

    NP-hardness is a well-known theory for identifying the hardness of computational problems. It is believed that NP-hard problems are unlikely to admit polynomial-time algorithms. However, since many NP-hard problems are of practical significance, different approaches have been proposed to solve them: approximation algorithms, randomized algorithms and heuristic algorithms. None of these approaches fully meets practical needs. Recently, parameterized computation and complexity has attracted a lot of attention and become a fruitful branch of the study of efficient algorithms. By taking advantage of the moderate value of parameters in many practical instances, we can design algorithms for NP-hard problems that are efficient in practice. In this dissertation, we discuss a new approach to the design of efficient parameterized algorithms: kernelization. The motivation is that instances of small size are easier to solve. Roughly speaking, kernelization is a preprocessing of the input instances that is able to significantly reduce their sizes. We present a 2k kernel for the cluster editing problem, which improves the previous best kernel of size 4k. We also present a linear kernel of size 7k + 2d for the d-cluster editing problem, which is the first linear kernel for the problem. The kernelization algorithm is simple and easy to implement. We propose a quadratic kernel for the pseudo-achromatic number problem. This implies that the problem is tractable in terms of parameterized complexity. We also study the general problem, the vertex grouping problem, and prove that it is intractable in terms of parameterized complexity. In practice, many problems seek a set of good solutions instead of a single good solution. Motivated by this, we present a framework to study enumerability in terms of parameterized complexity. We study three popular techniques for the design of parameterized algorithms, and show that, combined with effective enumeration techniques, they can be adapted to the design of efficient enumeration algorithms.

    Polynomial kernels for 3-leaf power graph modification problems

    A graph G=(V,E) is a 3-leaf power iff there exists a tree T whose leaves are V and such that (u,v) is an edge iff u and v are at distance at most 3 in T. The 3-leaf power graph edge modification problems, i.e. edition (also known as the closest 3-leaf power), completion and edge-deletion, are FPT when parameterized by the size of the edge set modification. However, no polynomial kernel was known for any of these three problems. For each of them, we provide a cubic kernel that can be computed in linear time. We thereby answer an open problem first mentioned by Dom, Guo, Hüffner and Niedermeier (2005).
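
    The definition in the first sentence translates directly into a constructive check: given a tree T, the 3-leaf power it generates joins two leaves exactly when their distance in T is at most 3. The following is a hedged sketch under my own naming, not code from the paper.

```python
# Build the 3-leaf power of a tree T: vertices are the leaves of T, and two
# leaves are adjacent iff their distance in T is at most 3.
from collections import deque

def bfs_distances(tree, source):
    dist = {source: 0}
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for w in tree[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

def three_leaf_power(tree):
    leaves = [v for v, nbrs in tree.items() if len(nbrs) == 1]
    graph = {v: set() for v in leaves}
    for u in leaves:
        dist = bfs_distances(tree, u)
        for v in leaves:
            if v != u and dist[v] <= 3:
                graph[u].add(v)
                graph[v].add(u)
    return graph

# A star with centre 'x': all leaves are pairwise at distance 2, so the
# resulting 3-leaf power is a clique on the leaves.
T = {'x': {'a', 'b', 'c'}, 'a': {'x'}, 'b': {'x'}, 'c': {'x'}}
print(three_leaf_power(T))
```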

    The Graph Motif problem parameterized by the structure of the input graph

    The Graph Motif problem was introduced in 2006 in the context of biological networks. It consists of deciding whether or not a multiset of colors occurs in a connected subgraph of a vertex-colored graph. Graph Motif has mostly been analyzed from the standpoint of parameterized complexity. The main parameters that have come into consideration are the size of the multiset and the number of colors. However, in many applications of Graph Motif, the input graph originates from real life and has structure. Motivated by this prosaic observation, we systematically study its complexity relative to graph structural parameters. For a wide range of parameters, we give new or improved FPT algorithms, or show that the problem remains intractable. For the FPT cases, we also give some kernelization lower bounds as well as some ETH-based lower bounds on the worst-case running time. Interestingly, we establish that Graph Motif is W[1]-hard (while in W[P]) for the parameter max leaf number, which is, to the best of our knowledge, the first problem to behave this way.
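
    The decision problem in the second sentence can be made concrete by a brute-force checker: try every vertex set whose size equals that of the motif, and accept if its colors match the multiset and it induces a connected subgraph. This is an illustrative sketch of the definition only (exponential in the number of vertices), not one of the FPT algorithms discussed in the paper.

```python
# Brute-force Graph Motif check: does some connected set of |motif| vertices
# carry exactly the multiset of colours `motif`?
from collections import Counter
from itertools import combinations

def is_connected(adj, subset):
    subset = set(subset)
    start = next(iter(subset))
    seen, stack = set(), [start]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(adj[v] & subset)   # stay inside the chosen subset
    return seen == subset

def graph_motif(adj, colour, motif):
    motif = Counter(motif)
    size = sum(motif.values())
    for subset in combinations(adj, size):
        if Counter(colour[v] for v in subset) == motif and is_connected(adj, subset):
            return set(subset)              # an occurrence of the motif
    return None

adj = {1: {2}, 2: {1, 3}, 3: {2}, 4: set()}
colour = {1: 'red', 2: 'blue', 3: 'red', 4: 'blue'}
print(graph_motif(adj, colour, ['red', 'blue']))   # e.g. {1, 2}
```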