On Computing Centroids According to the p-Norms of Hamming Distance Vectors
In this paper we consider the p-Norm Hamming Centroid problem, which asks to determine whether some given strings have a centroid with a bound on the p-norm of its Hamming distances to the strings. Specifically, given a set S of strings and a real number k, we consider the problem of determining whether there exists a string s* with (sum_{s in S} d^p(s*, s))^(1/p) <= k, where d(.,.) denotes the Hamming distance metric. This problem has important applications in data clustering and multi-winner committee elections, and is a generalization of the well-known polynomial-time solvable Consensus String (p = 1) problem, as well as the NP-hard Closest String (p = infty) problem.
Our main result shows that the problem is NP-hard for all fixed rational p > 1, closing the gap for all rational values of p between 1 and infty. Under standard complexity assumptions, the reduction also implies that the problem has no 2^o(n+m)-time or 2^o(k^(p/(p+1)))-time algorithm, where m denotes the number of input strings and n denotes the length of each string, for any fixed p > 1. The first bound matches a straightforward brute-force algorithm. The second bound is tight in the sense that, for each fixed epsilon > 0, we provide a 2^(k^(p/(p+1)+epsilon))-time algorithm. In the last part of the paper, we complement our hardness result by presenting a fixed-parameter algorithm and a factor-2 approximation algorithm for the problem.
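To make the objective concrete, here is a minimal brute-force sketch (not the paper's algorithm, whose point is precisely to avoid this exponential search) that evaluates the p-norm Hamming objective over all candidate centroids of a small binary instance; all function names are illustrative.

```python
from itertools import product

def hamming(a, b):
    """Number of positions where strings a and b differ."""
    return sum(x != y for x, y in zip(a, b))

def pnorm_centroid_cost(strings, p, alphabet="01"):
    """Minimum p-norm of the Hamming distance vector over all candidate
    centroids (brute force: exponential in the string length n)."""
    n = len(strings[0])
    best = float("inf")
    for cand in product(alphabet, repeat=n):
        s = "".join(cand)
        cost = sum(hamming(s, t) ** p for t in strings) ** (1.0 / p)
        best = min(best, cost)
    return best

# For p = 2 the optimum below is the string 0111, at Hamming distance 1
# from each input, giving cost sqrt(3) ~ 1.732.
print(pnorm_centroid_cost(["0011", "0101", "0110"], p=2))
```

Note how the choice of p interpolates between the two named special cases: p = 1 sums the distances (Consensus String), while letting p grow penalizes the largest distance, approaching the max as in Closest String.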
Parameterized Complexity of Critical Node Cuts
We consider the following natural graph cut problem called Critical Node Cut
(CNC): Given a graph G on n vertices, and two positive integers k and x,
determine whether G has a set of k vertices whose removal leaves G with at
most x connected pairs of vertices. We analyze this problem in the framework
of parameterized complexity. That is, we are interested in whether or not
this problem is solvable in f(kappa) * n^O(1) time (i.e., whether or not it
is fixed-parameter tractable), for various natural parameters kappa. We
consider four such parameters:
- The size k of the required cut.
- The upper bound x on the number of remaining connected pairs.
- The lower bound y on the number of connected pairs to be removed.
- The treewidth of G.
We determine whether or not CNC is fixed-parameter tractable for each of
these parameters. We determine this also for all possible aggregations of these
four parameters, apart from a single one. Moreover, we also determine whether
or not CNC admits a polynomial kernel for all these parameterizations. That is,
whether or not there is an algorithm that reduces each instance of CNC in
polynomial time to an equivalent instance of size kappa^O(1), where kappa
is the given parameter.
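As an illustration of the problem definition only, the following sketch decides a tiny CNC instance by exhaustive search; the parameter names k (cut size) and x (bound on remaining connected pairs) are chosen here for illustration, and the running time is exponential, unlike the parameterized algorithms the abstract refers to.

```python
from itertools import combinations
from collections import Counter

def connected_pairs(n, edges, removed):
    """Count vertex pairs in the same connected component after deleting
    the vertex set `removed` (union-find with path halving)."""
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for u, v in edges:
        if u not in removed and v not in removed:
            parent[find(u)] = find(v)
    sizes = Counter(find(v) for v in range(n) if v not in removed)
    return sum(s * (s - 1) // 2 for s in sizes.values())

def is_yes_instance(n, edges, k, x):
    """Brute-force CNC check: is there a set of k vertices whose removal
    leaves at most x connected pairs? Tries all C(n, k) subsets."""
    return any(connected_pairs(n, edges, set(S)) <= x
               for S in combinations(range(n), k))

# Path 0-1-2-3: removing vertex 1 leaves components {0} and {2,3},
# i.e. a single connected pair, so (k=1, x=1) is a yes-instance.
print(is_yes_instance(4, [(0, 1), (1, 2), (2, 3)], k=1, x=1))  # True
```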
Parameterized Two-Player Nash Equilibrium
We study the computation of Nash equilibria in a two-player normal form game
from the perspective of parameterized complexity. Recent results proved
hardness for a number of variants, when parameterized by the support size. We
complement those results, by identifying three cases in which the problem
becomes fixed-parameter tractable. These cases occur in the previously studied
settings of sparse games and unbalanced games, as well as in the newly
considered case of locally bounded treewidth games, which generalizes both of
these settings.
Parameterized Complexity Dichotomy for Steiner Multicut
The Steiner Multicut problem asks, given an undirected graph G, terminal
sets T1, ..., Tt ⊆ V(G) of size at most p, and an integer k, whether there
is a set S of at most k edges or nodes such that, for each set Ti, at least
one pair of its terminals lies in different connected components of G \ S.
This problem
generalizes several graph cut problems, in particular the Multicut problem (the
case p = 2), which is fixed-parameter tractable for the parameter k [Marx and
Razgon, Bousquet et al., STOC 2011].
We provide a dichotomy of the parameterized complexity of Steiner Multicut.
That is, for any combination of k, t, p, and the treewidth tw(G) as constant,
parameter, or unbounded, and for all versions of the problem (edge deletion and
node deletion with and without deletable terminals), we prove either that the
problem is fixed-parameter tractable or that the problem is hard (W[1]-hard or
even (para-)NP-complete). We highlight that:
- The edge deletion version of Steiner Multicut is fixed-parameter tractable
for the parameter k+t on general graphs (but has no polynomial kernel, even on
trees). We present two proofs: one using the randomized contractions technique
of Chitnis et al., and one relying on new structural lemmas that decompose the
Steiner cut into important separators and minimal s-t cuts.
- In contrast, both node deletion versions of Steiner Multicut are W[1]-hard
for the parameter k+t on general graphs.
- All versions of Steiner Multicut are W[1]-hard for the parameter k, even
when p=3 and the graph is a tree plus one node. Hence, the results of Marx and
Razgon, and Bousquet et al. do not generalize to Steiner Multicut.
Since we allow k, t, p, and tw(G) to be any constants, our characterization
includes a dichotomy for Steiner Multicut on trees (for tw(G) = 1), and a
polynomial time versus NP-hardness dichotomy (by restricting k,t,p,tw(G) to
constant or unbounded).
Comment: As submitted to journal. This version also adds a proof of
fixed-parameter tractability for parameter k+t using the technique of
randomized contractions.
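For intuition about the problem statement, a brute-force check of the edge-deletion version on tiny instances (illustrative only; the names are made up here, and the fixed-parameter algorithms discussed above are far more involved):

```python
from itertools import combinations

def components(vertices, edges):
    """Map each vertex to a component representative via DFS."""
    adj = {v: [] for v in vertices}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    comp = {}
    for s in vertices:
        if s in comp:
            continue
        stack = [s]
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp[v] = s
            stack.extend(adj[v])
    return comp

def steiner_multicut_edges(vertices, edges, terminal_sets, k):
    """Brute-force edge-deletion Steiner Multicut: is there a set S of at
    most k edges such that, for every terminal set T_i, some pair of its
    terminals ends up in different connected components of G - S?"""
    for r in range(k + 1):
        for S in combinations(edges, r):
            comp = components(vertices, [e for e in edges if e not in S])
            if all(any(comp[a] != comp[b] for a, b in combinations(T, 2))
                   for T in terminal_sets):
                return True
    return False

# Triangle with terminal sets {0,1} and {1,2}: no single edge separates
# a pair in both sets, but deleting the two edges incident to vertex 1 does.
tri = [(0, 1), (1, 2), (0, 2)]
print(steiner_multicut_edges([0, 1, 2], tri, [(0, 1), (1, 2)], 1))  # False
print(steiner_multicut_edges([0, 1, 2], tri, [(0, 1), (1, 2)], 2))  # True
```

With t = 1 and p = 2 this degenerates to minimum s-t cut, which is why the hardness results need several terminal sets.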
Hierarchies of Inefficient Kernelizability
The framework of Bodlaender et al. (ICALP 2008) and Fortnow and Santhanam
(STOC 2008) allows us to exclude the existence of polynomial kernels for a
range of problems under reasonable complexity-theoretical assumptions. However,
there are also some issues that are not addressed by this framework, including
the existence of Turing kernels such as the "kernelization" of Leaf Out
Branching(k) into a disjunction over n instances of size poly(k). Observing
that Turing kernels are preserved by polynomial parametric transformations, we
define a kernelization hardness hierarchy, akin to the M- and W-hierarchy of
ordinary parameterized complexity, by the PPT-closure of problems that seem
likely to be fundamentally hard for efficient Turing kernelization. We find
that several previously considered problems are complete for our fundamental
hardness class, including Min Ones d-SAT(k), Binary NDTM Halting(k), Connected
Vertex Cover(k), and Clique(k log n), the clique problem parameterized by k log
n.
A Note on Clustering Aggregation
We consider the clustering aggregation problem in which we are given a set of
clusterings and want to find an aggregated clustering which minimizes the sum
of mismatches to the input clusterings. In the binary case (each clustering is
a bipartition) this problem was known to be NP-hard under Turing reduction. We
strengthen this result by providing a polynomial-time many-one reduction. Our
result also implies that no 2^o(n)-time algorithm exists for any clustering
instance with n elements, unless the Exponential Time Hypothesis fails. On the
positive side, we show that the problem is fixed-parameter tractable with
respect to the number of input clusterings.
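A small brute-force sketch of the aggregation objective (exponential in the number of elements, so only for intuition; function names are illustrative, and clusterings are encoded as cluster-label lists):

```python
from itertools import combinations

def mismatches(c1, c2):
    """Pairs of elements co-clustered in exactly one of the two clusterings."""
    return sum((c1[i] == c1[j]) != (c2[i] == c2[j])
               for i, j in combinations(range(len(c1)), 2))

def partitions(n):
    """All clusterings of {0..n-1} as restricted-growth label lists."""
    def rec(labels, mx):
        if len(labels) == n:
            yield list(labels)
            return
        for lab in range(mx + 2):
            yield from rec(labels + [lab], max(mx, lab))
    yield from rec([0], 0)

def aggregate(clusterings):
    """Brute-force aggregated clustering minimizing the sum of mismatches."""
    n = len(clusterings[0])
    return min(partitions(n),
               key=lambda c: sum(mismatches(c, d) for d in clusterings))

# Three bipartitions of four elements; the two identical inputs outvote
# the third, so the aggregate follows them.
print(aggregate([[0, 0, 1, 1], [0, 0, 1, 1], [0, 1, 0, 1]]))  # [0, 0, 1, 1]
```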