14,265 research outputs found
Fast Algorithms for Parameterized Problems with Relaxed Disjointness Constraints
In parameterized complexity, it is a natural idea to consider different
generalizations of classic problems. Usually, such generalizations are obtained
by introducing a "relaxation" variable, where the original problem corresponds
to setting this variable to a constant value. For instance, the problem of
packing sets of size at most p into a given universe generalizes the Maximum
Matching problem, which is recovered by taking p = 2. Most often, the
complexity of the problem increases with the relaxation variable, but very
recently Abasi et al. have given a surprising example of a problem,
r-Simple k-Path, that can be solved by a randomized algorithm with
running time r^(2k/r) * n^O(1). That is, the complexity of the
problem decreases with r. In this paper we pursue further the direction
sketched by Abasi et al. Our main contribution is a derandomization tool that
provides a deterministic counterpart of the main technical result of Abasi et
al.: the r^(2k/r) * n^O(1) time algorithm for (r,k)-Monomial
Detection, which is the problem of finding a monomial of total degree k and
individual degrees at most r in a polynomial given as an arithmetic circuit.
Our technique works for a large class of circuits, and in particular it can be
used to derandomize the result of Abasi et al. for r-Simple k-Path. On our
way to this result we introduce the notion of representative sets for
multisets, which may be of independent interest. Finally, we give two more
examples of problems, already studied in the literature, where the
same relaxation phenomenon happens. The first one is a natural relaxation of
the Set Packing problem, where we allow the packed sets to overlap at each
element at most r times. The second one is Degree Bounded Spanning Tree,
where we seek a spanning tree of the graph with small maximum degree.
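As a concrete illustration, the relaxed disjointness constraint in the Set Packing example can be checked directly: a family of sets is feasible when no element is used more than r times. A minimal sketch in Python (function name and instance are hypothetical):

```python
from collections import Counter

def is_relaxed_packing(sets, r):
    """Check that every element occurs in at most r of the given sets.
    With r = 1 this is the usual (disjoint) Set Packing constraint."""
    usage = Counter()
    for s in sets:
        usage.update(set(s))  # each set contributes at most once per element
    return all(count <= r for count in usage.values())

# r = 1 demands pairwise disjointness; larger r relaxes it.
print(is_relaxed_packing([{1, 2}, {2, 3}], 1))  # False: element 2 used twice
print(is_relaxed_packing([{1, 2}, {2, 3}], 2))  # True
```

Taking r = 1 recovers the classic disjointness constraint, matching the pattern of the relaxation variable described above.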
More Applications of the d-Neighbor Equivalence: Connectivity and Acyclicity Constraints
In this paper, we design a framework to obtain efficient algorithms for several problems with a global constraint (acyclicity or connectivity), such as Connected Dominating Set, Node Weighted Steiner Tree, Maximum Induced Tree, Longest Induced Path, and Feedback Vertex Set. For all these problems, we obtain 2^O(k) * n^O(1), 2^O(k log(k)) * n^O(1), 2^O(k^2) * n^O(1) and n^O(k) time algorithms parameterized respectively by clique-width, Q-rank-width, rank-width and maximum induced matching width. Our approach simplifies and unifies the known algorithms for each of the parameters, and its running times also match asymptotically those of the best algorithms for basic NP-hard problems such as Vertex Cover and Dominating Set. Our framework is based on the d-neighbor equivalence defined in [Bui-Xuan, Telle and Vatshelle, TCS 2013]. The results we obtain highlight the importance and the generalizing power of this equivalence relation on width measures. We also prove that this equivalence relation could be useful for Max Cut, a W[1]-hard problem parameterized by clique-width. For this latter problem, we obtain n^O(k), n^O(k) and n^(2^O(k)) time algorithms parameterized respectively by clique-width, Q-rank-width and rank-width.
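The d-neighbor equivalence at the heart of this framework has a short operational definition: two subsets X, Y of a vertex set A are d-neighbor equivalent if every vertex outside A has the same number of neighbors in X as in Y, counted up to the cap d. A brute-force sketch that groups subsets into these classes (tiny hypothetical graph; exponential in |A|, for illustration only, whereas the paper exploits the small number of classes):

```python
from itertools import chain, combinations

def d_signature(G, A, X, d):
    """Signature of X ⊆ A: for each vertex outside A, |N(v) ∩ X| capped at d."""
    outside = [v for v in G if v not in A]
    return tuple(min(d, sum(1 for u in G[v] if u in X)) for v in outside)

def d_neighbor_classes(G, A, d):
    """Group all subsets of A into d-neighbor equivalence classes."""
    classes = {}
    subsets = chain.from_iterable(combinations(A, r) for r in range(len(A) + 1))
    for X in map(frozenset, subsets):
        classes.setdefault(d_signature(G, A, X, d), []).append(X)
    return classes

# Tiny example graph as an adjacency dict (hypothetical instance).
G = {1: {3}, 2: {3}, 3: {1, 2, 4}, 4: {3}}
A = {1, 2}
cls = d_neighbor_classes(G, A, 1)
# With d = 1, the subsets {1}, {2} and {1, 2} all fall into one class:
# each gives vertex 3 at least one neighbor in X, capped at 1.
```

Dynamic programming over a decomposition then needs only one table entry per equivalence class rather than per subset, which is what drives the running times above.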
Parameterized Algorithms for Graph Partitioning Problems
We study a broad class of graph partitioning problems, where each problem is
specified by a graph G = (V, E), and parameters k and t. We seek a subset
U ⊆ V of size k, such that a1 * m(U) + a2 * m(U, V \ U) is at most
(or at least) t, where a1, a2 are constants
defining the problem, and m(U), m(U, V \ U) are the cardinalities of the edge sets
having both endpoints, and exactly one endpoint, in U, respectively. This
class of fixed cardinality graph partitioning problems (FGPP) encompasses Max
(k,n-k)-Cut, Min k-Vertex Cover, k-Densest Subgraph, and k-Sparsest
Subgraph.
Our main result is a fixed-parameter algorithm for any problem in
this class, parameterized by k and the maximum degree of the input graph.
This resolves an open question posed by Bonnet et al. [IPEC 2013]. We obtain
faster algorithms for certain subclasses of FGPPs under further
parameterizations. In particular, we give a significantly faster algorithm for Max
(k,n-k)-Cut, improving the best previously known
algorithm.
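The objective shared by all FGPPs can be written out explicitly. A brute-force sketch, assuming an edge-list representation (names and instance hypothetical; the actual parameterized algorithms are far more efficient than this exponential enumeration):

```python
from itertools import combinations

def fgpp_value(edges, U, a1, a2):
    """a1 * m(U) + a2 * m(U, V \\ U): m(U) counts edges inside U,
    m(U, V \\ U) counts edges with exactly one endpoint in U."""
    inside = sum(1 for u, v in edges if u in U and v in U)
    crossing = sum(1 for u, v in edges if (u in U) != (v in U))
    return a1 * inside + a2 * crossing

def best_fgpp(vertices, edges, k, a1, a2, maximize=True):
    """Brute force over all size-k subsets (exponential; illustration only)."""
    pick = max if maximize else min
    return pick(
        (fgpp_value(edges, set(U), a1, a2), U)
        for U in combinations(sorted(vertices), k)
    )

# Max (k,n-k)-Cut is the special case a1 = 0, a2 = 1, shown on a 4-cycle:
V = {1, 2, 3, 4}
E = [(1, 2), (2, 3), (3, 4), (4, 1)]
print(best_fgpp(V, E, 2, 0, 1))  # value 4: {1, 3} or {2, 4} cuts all four edges
```

Choosing a1 = 0, a2 = 1 and maximizing gives Max (k,n-k)-Cut; a1 = 1, a2 = 1 with "at least t" captures vertex-cover-like variants, which is how the single objective covers the whole class.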
Speeding-up Dynamic Programming with Representative Sets - An Experimental Evaluation of Algorithms for Steiner Tree on Tree Decompositions
Dynamic programming on tree decompositions is a frequently used approach to
solve otherwise intractable problems on instances of small treewidth. In recent
work by Bodlaender et al., it was shown that for many connectivity problems,
there exist algorithms that use time linear in the number of vertices and
single-exponential in the width of the tree decomposition that is used. The
central idea is that it suffices to compute representative sets, and these can
be computed efficiently with help of Gaussian elimination.
In this paper, we give an experimental evaluation of this technique for the
Steiner Tree problem. A comparison of the classic dynamic programming algorithm
and the improved dynamic programming algorithm that employs the table reduction
shows that the new approach gives significant improvements in the running time
of the algorithm and in the size of the tables computed by the dynamic
programming algorithm. Thus, the rank-based approach from Bodlaender et al.
not only gives significant theoretical improvements but is also viable in a
practical setting, and it showcases the potential of exploiting the idea of
representative sets for speeding up dynamic programming algorithms.
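The rank-based table reduction can be illustrated in miniature: if each partial solution is encoded as a 0/1 row indexed by its possible completions, Gaussian elimination over GF(2) keeps only a linearly independent, representative subset of rows. A minimal sketch, assuming such an encoding is already given (this is not the paper's implementation, only the linear-algebra core):

```python
def reduce_rows_gf2(rows):
    """Keep a maximal linearly independent subset of 0/1 rows over GF(2).
    In the rank-based approach, dependent rows are dominated by the kept
    ones and can be dropped without losing an optimal solution."""
    basis = {}   # leading-bit position -> reduced basis vector (bitmask)
    kept = []    # indices into `rows` of the representative subset
    for i, row in enumerate(rows):
        v = 0
        for bit in row:
            v = (v << 1) | bit  # pack the 0/1 row into an integer bitmask
        while v:
            lead = v.bit_length() - 1
            if lead not in basis:
                basis[lead] = v  # new pivot: the row is independent
                kept.append(i)
                break
            v ^= basis[lead]     # eliminate the leading bit and continue
        # if v reduces to 0, the row is a GF(2) combination of kept rows
    return kept

# The third row is the XOR of the first two, so it is dropped.
print(reduce_rows_gf2([[1, 0, 1], [0, 1, 1], [1, 1, 0]]))  # [0, 1]
```

Since the number of columns is single-exponential in the treewidth, the kept table never exceeds that rank bound, which is exactly the table-size improvement measured in the experiments.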
More applications of the d-neighbor equivalence: acyclicity and connectivity constraints
In this paper, we design a framework to obtain efficient algorithms for
several problems with a global constraint (acyclicity or connectivity) such as
Connected Dominating Set, Node Weighted Steiner Tree, Maximum Induced Tree,
Longest Induced Path, and Feedback Vertex Set. We design a meta-algorithm that
solves all these problems and whose running time is upper bounded by
2^O(k) * n^O(1), 2^O(k log(k)) * n^O(1), 2^O(k^2) * n^O(1) and n^O(k),
where k is respectively the clique-width,
Q-rank-width, rank-width and maximum induced matching width of a
given decomposition. Our meta-algorithm simplifies and unifies the known
algorithms for each of the parameters, and its running time matches
asymptotically also the running times of the best known algorithms for basic
NP-hard problems such as Vertex Cover and Dominating Set. Our framework is
based on the d-neighbor equivalence defined in [Bui-Xuan, Telle and
Vatshelle, TCS 2013]. The results we obtain highlight the importance of this
equivalence relation for the algorithmic applications of width measures.
We also prove that our framework could be useful for W[1]-hard problems
parameterized by clique-width such as Max Cut and Maximum Minimal Cut. For
these latter problems, we obtain n^O(k), n^O(k) and n^(2^O(k)) time
algorithms, where k is respectively the clique-width, the
Q-rank-width and the rank-width of the input graph.
Fair Knapsack
We study the following multiagent variant of the knapsack problem. We are
given a set of items, a set of voters, and a budget; each item is
endowed with a cost, and each voter assigns to each item a certain value. The
goal is to select a subset of items with total cost not exceeding the
budget, in a way that is consistent with the voters' preferences. Since the
preferences of the voters over the items can vary significantly, we need a way
of aggregating these preferences in order to select the socially best valid
knapsack. We study three approaches to aggregating voters' preferences,
motivated by the literature on multiwinner elections and fair allocation.
This way we introduce the concepts of individually best, diverse, and fair
knapsack. We study the computational complexity (including parameterized
complexity, and complexity under restricted domains) of the aforementioned
multiagent variants of knapsack.
Comment: Extended abstract will appear in Proc. of 33rd AAAI 2019.
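When the aggregation simply sums each item's value over all voters (a utilitarian reading of the individually best rule), the problem reduces to ordinary 0/1 knapsack and is solvable by the textbook pseudo-polynomial DP. A minimal sketch under that assumption (names and instance hypothetical; the fair and diverse rules studied in the paper are computationally harder):

```python
def utilitarian_knapsack(costs, values, budget):
    """0/1 knapsack DP: maximize total aggregated value within the budget.
    values[i] is item i's value already summed over all voters."""
    best = [0] * (budget + 1)   # best[b] = max value achievable with cost <= b
    for c, v in zip(costs, values):
        for b in range(budget, c - 1, -1):  # descending: each item used once
            best[b] = max(best[b], best[b - c] + v)
    return best[budget]

# Three items with aggregated voter values, budget 5 (hypothetical instance).
print(utilitarian_knapsack([2, 3, 4], [3, 4, 6], 5))  # 7: items 0 and 1
```

The descending inner loop ensures each item is selected at most once; iterating ascending would instead model the unbounded variant.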