Tight bounds for parameterized complexity of Cluster Editing
In the Correlation Clustering problem, also known as Cluster Editing, we are given an undirected graph G and a positive integer k; the task is to decide whether G can be transformed into a cluster graph, i.e., a disjoint union of cliques, by changing at most k adjacencies, that is, by adding or deleting at most k edges. The motivation for the problem stems from various tasks in computational biology (Ben-Dor et al., Journal of Computational Biology 1999) and machine learning (Bansal et al., Machine Learning 2004). Although in general Correlation Clustering is APX-hard (Charikar et al., FOCS 2003), the version of the problem where the number of cliques may not exceed a prescribed constant p admits a PTAS (Giotis and Guruswami, SODA 2006). We study the parameterized complexity of Correlation Clustering with this restriction on the number of cliques to be created. We give an algorithm that, in time O(2^{O(sqrt{pk})} + n + m), decides whether a graph G on n vertices and m edges can be transformed into a cluster graph with exactly p cliques by changing at most k adjacencies. We complement these algorithmic findings with the following, surprisingly tight lower bound on the asymptotic behavior of our algorithm. We show that, unless the Exponential Time Hypothesis (ETH) fails, for any constant 0 <= sigma <= 1 there is p = Theta(k^sigma) such that there is no algorithm deciding in time 2^{o(sqrt{pk})} n^{O(1)} whether an n-vertex graph G can be transformed into a cluster graph with at most p cliques by changing at most k adjacencies. Thus, our upper and lower bounds provide an asymptotically tight analysis of the multivariate parameterized complexity of the problem for the whole range of values of p, from constant to a linear function of k.
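To make the target structure of the problem concrete (an illustration, not part of the paper): a graph is a cluster graph exactly when every connected component induces a clique. A minimal sketch, assuming vertices 0..n-1 and the edge set given as frozensets:

```python
from itertools import combinations

def is_cluster_graph(n, edges):
    """Check whether a graph is a disjoint union of cliques.

    Vertices are 0..n-1; `edges` is a set of frozensets {u, v}.
    """
    # Build adjacency lists.
    adj = {v: set() for v in range(n)}
    for e in edges:
        u, v = tuple(e)
        adj[u].add(v)
        adj[v].add(u)

    seen = set()
    for start in range(n):
        if start in seen:
            continue
        # Collect the connected component of `start`.
        comp, queue = {start}, [start]
        while queue:
            x = queue.pop()
            for y in adj[x]:
                if y not in comp:
                    comp.add(y)
                    queue.append(y)
        seen |= comp
        # The component is a clique iff every pair is adjacent.
        if any(frozenset((u, v)) not in edges
               for u, v in combinations(comp, 2)):
            return False
    return True
```

Cluster Editing then asks whether at most k edge additions/deletions suffice to make this check succeed.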
Tight bounds for parameterized complexity of cluster editing with a small number of clusters
In the Cluster Editing problem, also known as Correlation Clustering, we are given an undirected n-vertex graph G and a positive integer k. The task is to decide if G can be transformed into a cluster graph, i.e., a disjoint union of cliques, by changing at most k adjacencies, i.e., by adding/deleting at most k edges. We give a subexponential-time parameterized algorithm that in time O(2^{O(sqrt{pk})} + n + m) decides whether G can be transformed into a cluster graph with exactly p cliques by changing at most k adjacencies. Our algorithmic findings are complemented by the following tight lower bound on the asymptotic behavior of our algorithm. We show that unless ETH fails, for any constant 0 < σ ≤ 1, there is p = Θ(k^σ) such that there is no algorithm deciding in time 2^{o(sqrt{pk})} n^{O(1)} whether G can be transformed into a cluster graph with at most p cliques by changing at most k adjacencies.
Fast branching algorithm for Cluster Vertex Deletion
In the family of clustering problems, we are given a set of objects (vertices
of the graph), together with some observed pairwise similarities (edges). The
goal is to identify clusters of similar objects by slightly modifying the graph
to obtain a cluster graph (disjoint union of cliques). Hueffner et al. [Theory
Comput. Syst. 2010] initiated the parameterized study of Cluster Vertex
Deletion, where the allowed modification is vertex deletion, and presented an
elegant O(2^k * k^9 + n * m)-time fixed-parameter algorithm, parameterized by
the solution size. In our work, we pick up this line of research and present an
O(1.9102^k * (n + m))-time branching algorithm.
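For intuition only (this is the folklore O(3^k) branching, not the refined O(1.9102^k) algorithm of the paper): cluster graphs are exactly the graphs with no induced path on three vertices, so one can repeatedly find an induced P3 and branch on which of its three vertices to delete.

```python
from itertools import combinations

def find_induced_p3(adj):
    """Return an induced path (u, v, w), with edges u-v and v-w but
    u, w non-adjacent, or None if the graph is already a cluster
    graph (cluster graphs are exactly the P3-free graphs)."""
    for v, nbrs in adj.items():
        for u, w in combinations(nbrs, 2):
            if w not in adj[u]:
                return (u, v, w)
    return None

def cvd(adj, k):
    """Decide Cluster Vertex Deletion by the folklore O(3^k)
    branching: every induced P3 must lose one of its vertices."""
    p3 = find_induced_p3(adj)
    if p3 is None:
        return True
    if k == 0:
        return False
    for x in p3:
        # Delete x and recurse with budget k - 1.
        sub = {v: nbrs - {x} for v, nbrs in adj.items() if v != x}
        if cvd(sub, k - 1):
            return True
    return False
```

The cited algorithms improve on this simple branching with more careful case analysis and data structures.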
Fast Biclustering by Dual Parameterization
We study two clustering problems, Starforest Editing, the problem of adding
and deleting edges to obtain a disjoint union of stars, and the generalization
Bicluster Editing. We show that, in addition to being NP-hard, none of the
problems can be solved in subexponential time unless the exponential time
hypothesis fails.
Misra, Panolan, and Saurabh (MFCS 2013) argue that introducing a bound on the
number of connected components in the solution should not make the problem
easier: In particular, they argue that the subexponential time algorithm for
editing to a fixed number of clusters (p-Cluster Editing) by Fomin et al. (J.
Comput. Syst. Sci., 80(7) 2014) is an exception rather than the rule. Here, p
is a secondary parameter, bounding the number of components in the solution.
However, upon bounding the number of stars or bicliques in the solution, we
obtain subexponential-time parameterized algorithms for p-Starforest Editing
and for p-Bicluster Editing. We obtain a similar result for the more general
case of t-Partite p-Cluster Editing. These algorithms are subexponential in k
for a fixed number of clusters, since p is then considered a constant.
Our results even out the number of multivariate subexponential time
algorithms and give reasons to believe that this area warrants further study.
Comment: Accepted for presentation at IPEC 201
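To make the target structure of Starforest Editing concrete (an illustration, not taken from the paper): a connected component on c vertices is a star exactly when it is a tree, i.e. has c - 1 edges, with one vertex adjacent to all c - 1 others. A minimal recognizer sketch:

```python
def is_starforest(n, edges):
    """Check whether a graph on vertices 0..n-1 is a disjoint union
    of stars (each component a K_{1,t}, including single vertices)."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    seen = set()
    for start in range(n):
        if start in seen:
            continue
        # Collect the connected component of `start`.
        comp, queue = {start}, [start]
        while queue:
            x = queue.pop()
            for y in adj[x]:
                if y not in comp:
                    comp.add(y)
                    queue.append(y)
        seen |= comp
        c = len(comp)
        comp_edges = sum(len(adj[v]) for v in comp) // 2
        # A star: a tree (c - 1 edges) with a vertex of degree c - 1.
        if comp_edges != c - 1 or max(len(adj[v]) for v in comp) != c - 1:
            return False
    return True
```

p-Starforest Editing asks whether at most k edge modifications yield such a graph with at most p stars.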
Fast Parallel Fixed-Parameter Algorithms via Color Coding
Fixed-parameter algorithms have been successfully applied to solve numerous
difficult problems within acceptable time bounds on large inputs. However, most
fixed-parameter algorithms are inherently \emph{sequential} and, thus, make no
use of the parallel hardware present in modern computers. We show that parallel
fixed-parameter algorithms not only exist for numerous parameterized
problems from the literature -- including vertex cover, packing problems,
cluster editing, cutting vertices, finding embeddings, or finding matchings --
but that there are parallel algorithms working in \emph{constant} time or at
least in time \emph{depending only on the parameter} (and not on the size of
the input) for these problems. Phrased in terms of complexity classes, we place
numerous natural parameterized problems in parameterized versions of AC^0. On
a more technical level, we show how the \emph{color coding} method can be
implemented in constant time and apply it to embedding problems for graphs of
bounded tree-width or tree-depth and to model checking first-order formulas in
graphs of bounded degree.
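The color-coding method mentioned above can be illustrated by its classic sequential application to finding a path on k vertices (a hypothetical minimal sketch, not the paper's parallel, constant-time implementation): color the vertices uniformly at random with k colors, then use dynamic programming over color subsets to look for a "colorful" path, one whose vertices all receive distinct colors.

```python
import random

def colorful_path_exists(adj, coloring, k):
    """DP over color subsets: is there a path on k vertices whose
    colors are pairwise distinct under `coloring`?"""
    # reach[v] = set of color subsets realizable by a colorful
    # path ending at v.
    reach = {v: {frozenset([coloring[v]])} for v in adj}
    for _ in range(k - 1):
        new = {v: set() for v in adj}
        for v in adj:
            for s in reach[v]:
                for w in adj[v]:
                    if coloring[w] not in s:
                        new[w].add(s | {coloring[w]})
        reach = new
    return any(len(s) == k for sets in reach.values() for s in sets)

def has_k_path(adj, k, trials=200):
    """Color coding: under a uniform random k-coloring, a fixed
    k-vertex path is colorful with probability k!/k^k, so repeating
    the random experiment boosts the success probability."""
    for _ in range(trials):
        coloring = {v: random.randrange(k) for v in adj}
        if colorful_path_exists(adj, coloring, k):
            return True
    return False
```

The paper's contribution is, roughly, that the random coloring and the subsequent bookkeeping can be carried out in parallel in constant time (or time depending only on the parameter), placing such problems in parameterized versions of AC.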
Parameterized Complexity of Biclique Contraction and Balanced Biclique Contraction
In this work, we initiate the complexity study of Biclique Contraction and
Balanced Biclique Contraction. In these problems, given as input a graph G and
an integer k, the objective is to determine whether one can contract at most k
edges in G to obtain a biclique and a balanced biclique, respectively. We first
prove that these problems are NP-complete even when the input graph is
bipartite. Next, we study the parameterized complexity of these problems and
show that they admit single exponential-time FPT algorithms when parameterized
by the number k of edge contractions. Then, we show that Balanced Biclique
Contraction admits a quadratic vertex kernel while Biclique Contraction does
not admit any polynomial compression (or kernel) under standard
complexity-theoretic assumptions.
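The edge-contraction operation underlying these problems can be sketched as follows (an illustrative helper under assumed adjacency-set representation, not code from the paper): contracting an edge {u, v} merges v into u, discarding the resulting self-loop and any parallel edges so the graph stays simple.

```python
def contract_edge(adj, u, v):
    """Contract the edge {u, v}: merge v into u, keeping the graph
    simple (no loops or parallel edges). `adj` maps each vertex to
    its set of neighbours; a new adjacency map is returned."""
    assert v in adj[u], "can only contract an existing edge"
    # Neighbourhood of the merged vertex, minus the contracted pair.
    merged = (adj[u] | adj[v]) - {u, v}
    new = {x: set(nbrs) for x, nbrs in adj.items() if x != v}
    new[u] = merged
    for x in new:
        if x == u:
            continue
        new[x].discard(v)      # v no longer exists
        if x in merged:
            new[x].add(u)      # x is adjacent to the merged vertex
    return new
```

Biclique Contraction asks whether at most k such contractions turn G into a complete bipartite graph; for example, contracting one edge of a 4-cycle yields a triangle, i.e. the contraction may create new adjacencies between former neighbours of u and v.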