Hardness of Vertex Deletion and Project Scheduling
Assuming the Unique Games Conjecture, we show strong inapproximability
results for two natural vertex deletion problems on directed graphs: for any
integer k >= 2 and arbitrarily small epsilon > 0, the Feedback Vertex Set
problem and the DAG Vertex Deletion problem are inapproximable within a factor
k - epsilon even on graphs where the vertices can be almost partitioned into
k solutions. This gives a more structured and therefore stronger UGC-based
hardness result for the Feedback Vertex Set problem that is also simpler
(albeit using the "It Ain't Over Till It's Over" theorem) than the previous
hardness result.
In comparison to the classical Feedback Vertex Set problem, the DAG Vertex
Deletion problem has received little attention and, although we think it is a
natural and interesting problem, the main motivation for our inapproximability
result stems from its relationship with the classical Discrete Time-Cost
Tradeoff Problem. More specifically, our results imply that the deadline
version is NP-hard to approximate within any constant assuming the Unique Games
Conjecture. This explains the difficulty in obtaining good approximation
algorithms for that problem and further motivates previous alternative
approaches such as bicriteria approximations.
Comment: 18 pages, 1 figure
Testing Consumer Rationality using Perfect Graphs and Oriented Discs
Given a consumer data-set, the axioms of revealed preference proffer a binary
test for rational behaviour. A natural (non-binary) measure of the degree of
rationality exhibited by the consumer is the minimum number of data points
whose removal induces a rationalisable data-set. We study the computational
complexity of the resultant consumer rationality problem in this paper. This
problem is, in the worst case, equivalent (in terms of approximation) to the
directed feedback vertex set problem. Our main result is to obtain an exact
threshold on the number of commodities that separates easy cases and hard
cases. Specifically, for two-commodity markets the consumer rationality problem
is polynomial time solvable; we prove this via a reduction to the vertex cover
problem on perfect graphs. For three-commodity markets, however, the problem is
NP-complete; we prove this using a reduction from planar 3-SAT that is based
upon oriented-disc drawings.
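To make the connection to directed feedback vertex set concrete, the sketch below builds the (direct) revealed-preference digraph from price/bundle data; deleting observations to break its violating cycles is what gives the problem its feedback-vertex-set flavour. This is a minimal illustrative sketch, not the authors' construction: the function name, the NumPy representation, and the simplification that every weak revealed preference yields an arc (GARP distinguishes weak and strict preferences) are assumptions.

```python
import numpy as np

def revealed_preference_digraph(prices, bundles):
    """Arc i -> j when bundle j was affordable at observation i's prices,
    i.e. observation i is directly revealed preferred to observation j."""
    n = len(prices)
    arcs = set()
    for i in range(n):
        for j in range(n):
            if i != j and prices[i] @ bundles[i] >= prices[i] @ bundles[j]:
                arcs.add((i, j))
    return arcs

# Two observations that reveal-prefer each other: the digraph has a 2-cycle,
# so the data is not rationalisable, and deleting either observation
# (a feedback vertex set of size 1) repairs it.
prices = [np.array([1.0, 2.0]), np.array([2.0, 1.0])]
bundles = [np.array([2.0, 2.0]), np.array([3.0, 1.0])]
print(revealed_preference_digraph(prices, bundles))  # {(0, 1), (1, 0)}
```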
On the Approximability of Digraph Ordering
Given an n-vertex digraph D = (V, A) the Max-k-Ordering problem is to compute
a labeling l : V -> {1, ..., k} maximizing the number of forward edges, i.e.
edges (u,v) such that l(u) < l(v). For different values of k, this
reduces to Maximum Acyclic Subgraph (k=n), and Max-Dicut (k=2). This work
studies the approximability of Max-k-Ordering and its generalizations,
motivated by their applications to job scheduling with soft precedence
constraints. We give an LP rounding based 2-approximation algorithm for
Max-k-Ordering for any k in {2, ..., n}, improving on the known
2k/(k-1)-approximation obtained via random assignment. The tightness of this
rounding is shown by proving that for any k in {2, ..., n} and constant
epsilon > 0, Max-k-Ordering has an LP integrality gap of 2 - epsilon for
n^{Omega(1/ log log n)} rounds of the Sherali-Adams hierarchy.
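To make the objective and the random-assignment baseline concrete, here is a minimal illustrative sketch (helper names and the dict representation are ours, not the paper's): a uniformly random labeling sends each edge forward with probability (k-1)/(2k), which is where the 2k/(k-1) ratio quoted above comes from.

```python
import random

def ordering_value(edges, labels):
    """Max-k-Ordering objective: number of forward edges (u, v) with labels[u] < labels[v]."""
    return sum(1 for u, v in edges if labels[u] < labels[v])

def random_labeling(vertices, k, rng=random):
    """Random assignment baseline: independent uniform labels from {1, ..., k};
    each edge goes forward with probability (k - 1) / (2k) in expectation."""
    return {v: rng.randint(1, k) for v in vertices}

# Tiny instance: a directed triangle with k = 2 (i.e. Max-Dicut);
# at most two of the three edges can ever go forward.
edges = [("a", "b"), ("b", "c"), ("c", "a")]
labels = random_labeling({"a", "b", "c"}, k=2)
print(ordering_value(edges, labels))
```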
A further generalization of Max-k-Ordering is the restricted maximum acyclic
subgraph problem or RMAS, where each vertex v has a finite set of allowable
labels. We prove an LP rounding based approximation for it that improves on
the approximation ratio recently given by Grandoni et al.
(Information Processing Letters, Vol. 115(2), Pages 182-185, 2015). In fact,
our approximation algorithm also works for a general version where the
objective counts the edges which go forward by at least a positive offset
specific to each edge.
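The offset variant mentioned at the end of the paragraph changes only the objective; a minimal sketch under an assumed edge representation (our choice, not the paper's):

```python
def offset_forward_value(edges_with_offsets, labels):
    """Count edges (u, v, offset) that go forward by at least their offset,
    i.e. labels[v] - labels[u] >= offset, as in the generalized objective above."""
    return sum(1 for u, v, off in edges_with_offsets if labels[v] - labels[u] >= off)

# With offset 1 on every edge this is exactly the forward-edge objective above.
print(offset_forward_value([("a", "b", 2)], {"a": 1, "b": 3}))  # 1
```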
The minimization formulation of digraph ordering is DAG edge deletion or
DED(k), which requires deleting the minimum number of edges from an n-vertex
directed acyclic graph (DAG) to remove all paths of length k. We show that
both the LP relaxation and a local ratio approach for DED(k) yield a
k-approximation for any k.
Comment: 21 pages, Conference version to appear in ESA 2015
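The feasibility condition in DED(k) is just a longest-path computation on the DAG. The sketch below assumes "length" counts edges and uses a successor-map representation; both conventions are ours for illustration.

```python
from functools import cache

def longest_path_edges(succ):
    """Number of edges on a longest directed path in a DAG given as a
    successor map {vertex: tuple of successors}."""
    @cache
    def depth(v):
        return max((1 + depth(w) for w in succ.get(v, ())), default=0)
    return max(depth(v) for v in succ)

# DED(k) asks for the fewest edge deletions so that no path with k edges remains,
# i.e. so that longest_path_edges(succ) < k.
succ = {"a": ("b",), "b": ("c",), "c": ("d",), "d": ()}
print(longest_path_edges(succ))  # 3: the path a -> b -> c -> d has three edges
```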
Inapproximability of Maximum Edge Biclique, Maximum Balanced Biclique and Minimum k-Cut from the Small Set Expansion Hypothesis
The Small Set Expansion Hypothesis (SSEH) is a conjecture which roughly states that it is NP-hard to distinguish between a graph with a small set of vertices whose expansion is almost zero and one in which all small sets of vertices have expansion almost one. In this work, we prove conditional inapproximability results for the following graph problems based on this hypothesis:
- Maximum Edge Biclique (MEB): given a bipartite graph G, find a complete bipartite subgraph of G with maximum number of edges. We show that, assuming SSEH and that NP != BPP, no polynomial time algorithm gives n^{1 - epsilon}-approximation for MEB for every constant epsilon > 0.
- Maximum Balanced Biclique (MBB): given a bipartite graph G, find a balanced complete bipartite subgraph of G with maximum number of vertices. Similar to MEB, we prove n^{1 - epsilon} ratio inapproximability for MBB for every epsilon > 0, assuming SSEH and that NP != BPP.
- Minimum k-Cut: given a weighted graph G, find a set of edges with minimum total weight whose removal splits the graph into k components. We prove that this problem is NP-hard to approximate to within (2 - epsilon) factor of the optimum for every epsilon > 0, assuming SSEH.
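For concreteness, a candidate solution to Minimum k-Cut is just an edge set whose removal leaves at least k components; a minimal sketch using networkx (the helper name and the use of networkx are our choices, not the paper's):

```python
import networkx as nx

def k_cut_weight(G, removed_edges, k):
    """Total weight of a candidate k-cut; raises if removing the edges does
    not split G into at least k connected components."""
    H = G.copy()
    H.remove_edges_from(removed_edges)
    if nx.number_connected_components(H) < k:
        raise ValueError("removing these edges does not yield k components")
    return sum(G[u][v].get("weight", 1) for u, v in removed_edges)

# Path a - b - c: deleting the single edge (a, b) is a 2-cut of weight 1.
G = nx.Graph([("a", "b"), ("b", "c")])
print(k_cut_weight(G, [("a", "b")], k=2))  # 1
```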
The ratios in our results are essentially tight since trivial algorithms give n-approximation to both MEB and MBB and 2-approximation algorithms are known for Minimum k-Cut [Saran and Vazirani, SIAM J. Comput., 1995].
Our first two results are proved by combining a technique developed by Raghavendra, Steurer and Tulsiani [Raghavendra et al., CCC, 2012] to avoid locality of gadget reductions with a generalization of Bansal and Khot's long code test [Bansal and Khot, FOCS, 2009] whereas our last result is shown via an elementary reduction.
Inapproximability of Maximum Biclique Problems, Minimum k-Cut and Densest At-Least-k-Subgraph from the Small Set Expansion Hypothesis
The Small Set Expansion Hypothesis (SSEH) is a conjecture which roughly
states that it is NP-hard to distinguish between a graph with a small subset of
vertices whose edge expansion is almost zero and one in which all small subsets
of vertices have expansion almost one. In this work, we prove inapproximability
results for the following graph problems based on this hypothesis:
- Maximum Edge Biclique (MEB): given a bipartite graph G, find a complete
bipartite subgraph of G with maximum number of edges.
- Maximum Balanced Biclique (MBB): given a bipartite graph G, find a
balanced complete bipartite subgraph of G with maximum number of vertices.
- Minimum k-Cut: given a weighted graph G, find a set of edges with
minimum total weight whose removal partitions G into k connected
components.
- Densest At-Least-k-Subgraph (DALS): given a weighted graph G, find a
set S of at least k vertices such that the induced subgraph on S has
maximum density (the ratio between the total weight of edges and the number of
vertices). A small illustrative sketch of these objectives follows the list.
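The sketch below illustrates the objectives just defined; the function names and the weights-as-dict representation are assumptions for illustration only.

```python
def subgraph_density(weighted_edges, S):
    """DALS objective: total weight of edges inside S divided by |S|."""
    inside = sum(w for (u, v), w in weighted_edges.items() if u in S and v in S)
    return inside / len(S)

def is_complete_biclique(edges, left, right):
    """Feasibility condition behind MEB/MBB: every left-right pair must be an edge."""
    return all((u, v) in edges for u in left for v in right)

# Toy bipartite graph on {a, b} x {x, y} with all four edges present.
edges = {("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")}
print(is_complete_biclique(edges, {"a", "b"}, {"x", "y"}))              # True
print(subgraph_density({e: 1.0 for e in edges}, {"a", "b", "x", "y"}))  # 4 edges / 4 vertices = 1.0
```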
We show that, assuming SSEH and NP != BPP, no polynomial time
algorithm gives n^{1 - epsilon}-approximation for MEB or MBB for every
constant epsilon > 0. Moreover, assuming SSEH, we show that it is NP-hard
to approximate Minimum k-Cut and DALS to within (2 - epsilon) factor
of the optimum for every constant epsilon > 0.
The ratios in our results are essentially tight since trivial algorithms give
n-approximation to both MEB and MBB and efficient 2-approximation
algorithms are known for Minimum k-Cut [SV95] and DALS [And07, KS09].
Our first result is proved by combining a technique developed by Raghavendra
et al. [RST12] to avoid locality of gadget reductions with a generalization of
Bansal and Khot's long code test [BK09] whereas our second result is shown via
elementary reductions.
Comment: A preliminary version of this work will appear at ICALP 2017 under a
different title "Inapproximability of Maximum Edge Biclique, Maximum Balanced
Biclique and Minimum k-Cut from the Small Set Expansion Hypothesis".
Improved Hardness for Cut, Interdiction, and Firefighter Problems
We study variants of the classic s-t cut problem and prove the following improved hardness results assuming the Unique Games Conjecture (UGC).
* For Length-Bounded Cut and Shortest Path Interdiction, we show that both problems are hard to approximate within any constant factor, even if we allow bicriteria approximation. If we want to cut vertices or the graph is directed, our hardness ratio for Length-Bounded Cut matches the best approximation ratio up to a constant. Previously, the best hardness ratio was 1.1377 for Length-Bounded Cut and 2 for Shortest Path Interdiction.
* For any constant k >= 2 and epsilon > 0, we show that Directed Multicut with k source-sink pairs is hard to approximate within a factor k - epsilon. This matches the trivial k-approximation algorithm (a small sketch of this baseline appears after the list). By a simple reduction, our result for k = 2 implies that Directed Multiway Cut with two terminals (also known as s-t Bicut) is hard to approximate within a factor 2 - epsilon, matching the trivial 2-approximation algorithm.
* Assuming a variant of the UGC (implied by another variant due to Bansal and Khot), we prove that it is hard to approximate Resource Minimization Fire Containment within any constant factor. Previously, the best hardness ratio was 2. For directed layered graphs with b layers, our hardness ratio Omega(log b) matches the best approximation algorithm.
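The trivial k-approximation referenced in the second bullet simply unions a minimum cut for each source-sink pair; a minimal sketch using networkx (function and variable names are ours), not the paper's algorithm:

```python
import networkx as nx

def trivial_directed_multicut(G, pairs):
    """Union of a minimum s_i -> t_i edge cut for each source-sink pair.
    Each individual cut costs at most the optimal multicut, so the union is
    within a factor k of optimal for k pairs."""
    cut = set()
    for s, t in pairs:
        cut |= set(nx.minimum_edge_cut(G, s, t))
    return cut

# Two pairs routed through a shared middle vertex; the union of the two
# single-pair cuts has two edges, which is optimal for this instance.
G = nx.DiGraph([("s1", "m"), ("s2", "m"), ("m", "t1"), ("m", "t2")])
print(trivial_directed_multicut(G, [("s1", "t1"), ("s2", "t2")]))
```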
Our results are based on a general method of converting an integrality gap instance to a length-control dictatorship test for variants of the s-t cut problem, which may be useful for other problems.
Hardness of Graph Pricing through Generalized Max-Dicut
The Graph Pricing problem is among the fundamental problems whose
approximability is not well-understood. While there is a simple combinatorial
1/4-approximation algorithm, the best hardness result remains at 1/2 assuming
the Unique Games Conjecture (UGC). We show that it is NP-hard to approximate
within a factor better than 1/4 under the UGC, so that the simple combinatorial
algorithm might be the best possible. We also prove that for any epsilon > 0,
there exists delta > 0 such that the integrality gap of n^{delta} rounds of
the Sherali-Adams hierarchy of linear programming for Graph Pricing is at most
1/2 + epsilon.
This work is based on the effort to view the Graph Pricing problem as a
Constraint Satisfaction Problem (CSP) simpler than the standard and complicated
formulation. We propose the problem called Generalized Max-Dicut(T), which
has a domain size T + 1 for every T >= 1. Generalized Max-Dicut(1) is the
well-known Max-Dicut. There is an approximation-preserving reduction from
Generalized Max-Dicut on directed acyclic graphs (DAGs) to Graph Pricing, and
both our results are achieved through this reduction. Besides its connection to
Graph Pricing, the hardness of Generalized Max-Dicut is interesting in its own
right since in most arity two CSPs studied in the literature, SDP-based
algorithms perform better than LP-based or combinatorial algorithms; for
this arity two CSP, a simple combinatorial algorithm does the best.
Comment: 28 pages
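Since Generalized Max-Dicut(1) is stated above to be the familiar Max-Dicut, here is that base case's objective as a minimal sketch (the general T >= 1 predicate is not spelled out in this abstract, so it is omitted; names are ours):

```python
def dicut_value(arcs, S):
    """Max-Dicut objective: number of arcs (u, v) leaving the chosen set S,
    i.e. with u in S and v not in S."""
    return sum(1 for u, v in arcs if u in S and v not in S)

# Directed triangle: any single vertex gives value 1, which is optimal here.
print(dicut_value([("a", "b"), ("b", "c"), ("c", "a")], {"a"}))  # 1
```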
Approximating Cumulative Pebbling Cost Is Unique Games Hard
The cumulative pebbling complexity of a directed acyclic graph G is defined
as cc(G) = min_P sum_i |P_i|, where the minimum is taken over all
legal (parallel) black pebblings P of G and |P_i| denotes the number of
pebbles on the graph during round i. Intuitively, cc(G) captures
the amortized Space-Time complexity of pebbling multiple copies of G in parallel.
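A minimal sketch of the quantities in this definition, assuming the standard parallel black pebbling rules (a pebble may be placed on v in round i only if every parent of v held a pebble in round i-1, and each target/sink node must be pebbled at some point); the names and dict/set representation are ours:

```python
def cumulative_cost(pebbling):
    """sum_i |P_i|: total number of pebbles on the graph summed over all rounds."""
    return sum(len(P) for P in pebbling)

def is_legal_parallel_pebbling(parents, pebbling, targets):
    """Check legality: a newly placed pebble needs all parents pebbled in the
    previous round, and every target node must be pebbled in some round."""
    prev, ever = set(), set()
    for P in pebbling:
        if any(not set(parents.get(v, ())) <= prev for v in P - prev):
            return False
        prev, ever = set(P), ever | P
    return set(targets) <= ever

# Pebble the path u -> v in two rounds: P_1 = {u}, P_2 = {u, v}; cost 1 + 2 = 3.
parents = {"u": (), "v": ("u",)}
pebbling = [{"u"}, {"u", "v"}]
print(is_legal_parallel_pebbling(parents, pebbling, targets={"v"}))  # True
print(cumulative_cost(pebbling))                                     # 3
```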
The cumulative pebbling complexity of a graph G is of particular interest in
the field of cryptography as cc(G) is tightly related to the
amortized Area-Time complexity of the Data-Independent Memory-Hard Function
(iMHF) f_{G,H} [AS15] defined using a constant indegree directed acyclic
graph (DAG) G and a random oracle H. A secure iMHF should have
amortized Space-Time complexity as high as possible, e.g., to deter a
brute-force password attacker who wants to find x such that f_{G,H}(x) = h.
Thus, to analyze the (in)security of a candidate iMHF f_{G,H}, it is crucial
to estimate the value cc(G) but currently, upper and lower bounds for
leading iMHF candidates differ by several orders of magnitude. Blocki and Zhou
recently showed that it is NP-hard to compute cc(G), but their techniques do
not even rule out an efficient (1 + epsilon)-approximation algorithm for any
constant epsilon > 0. We show that for any constant c > 0, it is Unique Games
hard to approximate cc(G) to within a factor of c.
(See the paper for the full abstract.)
Comment: 28 pages, updated figures and corrected typos