26 research outputs found

    On the NP-Hardness of Approximating Ordering Constraint Satisfaction Problems

    We show improved NP-hardness of approximating Ordering Constraint Satisfaction Problems (OCSPs). For the two most well-studied OCSPs, Maximum Acyclic Subgraph and Maximum Betweenness, we prove inapproximability of $14/15+\epsilon$ and $1/2+\epsilon$. An OCSP is said to be approximation resistant if it is hard to approximate better than taking a uniformly random ordering. We prove that the Maximum Non-Betweenness Problem is approximation resistant and that there are width-$m$ approximation-resistant OCSPs accepting only a fraction $1/(m/2)!$ of assignments. These results provide the first examples of approximation-resistant OCSPs subject only to P $\neq$ NP.
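
    The abstract measures hardness against the uniformly random ordering baseline. As a purely illustrative check (not from the paper): a uniformly random ordering satisfies each Maximum Acyclic Subgraph constraint ("u before v") with probability 1/2 and each Betweenness constraint ("v lies between u and w") with probability 1/3. A minimal Python simulation confirming those two numbers:

```python
import random

def random_ordering_baseline(trials=100_000):
    """Estimate the fraction of constraints satisfied by a uniformly
    random ordering, for the two OCSPs named in the abstract."""
    mas_hits = btw_hits = 0
    for _ in range(trials):
        # Place three items u, v, w at distinct uniformly random positions.
        pos = dict(zip("uvw", random.sample(range(1000), 3)))
        # Maximum Acyclic Subgraph constraint: "u before v".
        mas_hits += pos["u"] < pos["v"]
        # Betweenness constraint: "v lies between u and w".
        btw_hits += (pos["u"] < pos["v"] < pos["w"]) or (pos["w"] < pos["v"] < pos["u"])
    return mas_hits / trials, btw_hits / trials

print(random_ordering_baseline())  # approximately (0.5, 0.333...)
```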

    Separating the NP-Hardness of the Grothendieck Problem from the Little-Grothendieck Problem


    The Quest for Strong Inapproximability Results with Perfect Completeness

    The Unique Games Conjecture (UGC) has pinned down the approximability of all constraint satisfaction problems (CSPs), showing that a natural semidefinite programming relaxation offers the optimal worst-case approximation ratio for any CSP. This elegant picture, however, does not apply to CSP instances that are perfectly satisfiable, due to the imperfect completeness inherent in the UGC. For the important case when the input CSP instance admits a satisfying assignment, it therefore remains wide open to understand how well it can be approximated. This work is motivated by the pursuit of a better understanding of the inapproximability of perfectly satisfiable instances of CSPs. Our main conceptual contribution is the formulation of a (hypergraph) version of Label Cover which we call "V label cover." Assuming a conjecture concerning the inapproximability of V label cover on perfectly satisfiable instances, we prove the following implications:
    * There is an absolute constant c0 such that for k >= 3, given a satisfiable instance of Boolean k-CSP, it is hard to find an assignment satisfying more than a c0 k^2/2^k fraction of the constraints.
    * Given a k-uniform hypergraph, k >= 2, for all epsilon > 0, it is hard to tell if it is q-strongly colorable or has no independent set with an epsilon fraction of vertices, where q = ceiling[k + sqrt(k) - 0.5] (illustrated for a few values of k right after this abstract).
    * Given a k-uniform hypergraph, k >= 3, for all epsilon > 0, it is hard to tell if it is (k-1)-rainbow colorable or has no independent set with an epsilon fraction of vertices.
    We further supplement the above results with a proof that an "almost Unique" version of Label Cover can be approximated within a constant factor on satisfiable instances.
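
    For concreteness, the strong-coloring threshold q from the second implication can be tabulated for small k. This is only a reading aid, not part of the paper's argument:

```python
import math

def strong_coloring_threshold(k: int) -> int:
    """q = ceiling[k + sqrt(k) - 0.5] from the hypergraph-coloring result above."""
    return math.ceil(k + math.sqrt(k) - 0.5)

for k in (2, 3, 4, 9, 16):
    print(k, strong_coloring_threshold(k))
# k=2 -> 3, k=3 -> 5, k=4 -> 6, k=9 -> 12, k=16 -> 20
```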

    A Greedy Algorithm for Subspace Approximation Problem

    In the subspace approximation problem, given m points in R^{n} and an integer k <= n, the goal is to find a k-dimensional subspace of R^{n} that minimizes the l_{p}-norm of the Euclidean distances to the given points. This problem generalizes several subspace approximation problems and has applications ranging from statistics, machine learning, and signal processing to biology. Deshpande et al. [Deshpande et al., 2011] gave a randomized O(sqrt{p})-approximation, and this bound was proved to be tight assuming NP != P by Guruswami et al. [Guruswami et al., 2016]. It is an intriguing question to determine the performance guarantee of deterministic algorithms for the problem. In this paper, we present a simple deterministic O(sqrt{p})-approximation algorithm with an equally simple analysis. This settles the approximability status of the problem up to a constant factor. Moreover, the simplicity of the algorithm makes it practically appealing.
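
    A minimal sketch of the objective being approximated, assuming numpy (this is not the paper's greedy algorithm): compute the l_p norm of the Euclidean distances from the points to a candidate k-dimensional subspace, here instantiated with the top-k right singular vectors, which are optimal for p = 2.

```python
import numpy as np

def lp_subspace_cost(points: np.ndarray, basis: np.ndarray, p: float) -> float:
    """l_p norm of the Euclidean distances from each point to span(basis).

    points: (m, n) array; basis: (n, k) array with orthonormal columns.
    """
    proj = points @ basis @ basis.T          # project each point onto the subspace
    dists = np.linalg.norm(points - proj, axis=1)
    return float(np.linalg.norm(dists, ord=p))

# Classical p = 2 baseline: the top-k right singular vectors are optimal.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
k = 3
_, _, Vt = np.linalg.svd(X, full_matrices=False)
Q = Vt[:k].T                                  # (n, k) orthonormal basis
print(lp_subspace_cost(X, Q, p=2))
```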

    Improved Hardness for Cut, Interdiction, and Firefighter Problems

    We study variants of the classic s-t cut problem and prove the following improved hardness results assuming the Unique Games Conjecture (UGC).
    * For Length-Bounded Cut and Shortest Path Interdiction, we show that both problems are hard to approximate within any constant factor, even if we allow bicriteria approximation. If we want to cut vertices or the graph is directed, our hardness ratio for Length-Bounded Cut matches the best approximation ratio up to a constant. Previously, the best hardness ratio was 1.1377 for Length-Bounded Cut and 2 for Shortest Path Interdiction.
    * For any constant k >= 2 and epsilon > 0, we show that Directed Multicut with k source-sink pairs is hard to approximate within a factor k - epsilon. This matches the trivial k-approximation algorithm (sketched right after this abstract). By a simple reduction, our result for k = 2 implies that Directed Multiway Cut with two terminals (also known as s-t Bicut) is hard to approximate within a factor 2 - epsilon, matching the trivial 2-approximation algorithm.
    * Assuming a variant of the UGC (implied by another variant of Bansal and Khot), we prove that it is hard to approximate Resource Minimization Fire Containment within any constant factor. Previously, the best hardness ratio was 2. For directed layered graphs with b layers, our hardness ratio Omega(log b) matches the best approximation algorithm.
    Our results are based on a general method of converting an integrality gap instance into a length-control dictatorship test for variants of the s-t cut problem, which may be useful for other problems.
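
    The trivial k-approximation for Directed Multicut referenced above is simply the union of k separate minimum s_i-t_i cuts: each single-pair cut costs at most the optimal multicut, so the union costs at most k times the optimum. A hedged sketch using networkx; the instance and node names are hypothetical, and this is the folklore baseline rather than anything from the paper's hardness construction.

```python
import networkx as nx

def trivial_multicut(G: nx.DiGraph, pairs):
    """Trivial k-approximation for Directed Multicut: take a minimum
    s_i-t_i cut separately for each pair and return the union of cut edges."""
    cut_edges = set()
    for s, t in pairs:
        _, (reachable, _) = nx.minimum_cut(G, s, t, capacity="capacity")
        cut_edges |= {(u, v) for u, v in G.edges
                      if u in reachable and v not in reachable}
    return cut_edges

# Tiny unit-capacity example.
G = nx.DiGraph()
G.add_edges_from([("s1", "a"), ("a", "t1"), ("s2", "a"), ("a", "t2")], capacity=1)
print(trivial_multicut(G, [("s1", "t1"), ("s2", "t2")]))
```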

    Tight hardness of the non-commutative Grothendieck problem

    We prove that for any $\varepsilon > 0$ it is NP-hard to approximate the non-commutative Grothendieck problem to within a factor $1/2 + \varepsilon$, which matches the approximation ratio of the algorithm of Naor, Regev, and Vidick (STOC'13). Our proof uses an embedding of $\ell_2$ into the space of matrices endowed with the trace norm, with the property that the image of standard basis vectors is longer than that of unit vectors with no large coordinates.
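
    The trace norm mentioned in the last sentence is the sum of the singular values of a matrix (numpy exposes it as the nuclear norm). A two-line reminder, purely illustrative and not the paper's embedding:

```python
import numpy as np

def trace_norm(A: np.ndarray) -> float:
    """Trace (nuclear) norm: the sum of the singular values of A."""
    return float(np.linalg.svd(A, compute_uv=False).sum())

A = np.array([[1.0, 2.0], [0.0, 1.0]])
# Agrees with numpy's built-in nuclear norm.
print(trace_norm(A), np.linalg.norm(A, ord="nuc"))
```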

    Algorithms and Hardness for Robust Subspace Recovery

    We consider a fundamental problem in unsupervised learning called subspace recovery: given a collection of $m$ points in $\mathbb{R}^n$, if many but not necessarily all of these points are contained in a $d$-dimensional subspace $T$, can we find it? The points contained in $T$ are called inliers and the remaining points are outliers. This problem has received considerable attention in computer science and in statistics. Yet efficient algorithms from computer science are not robust to adversarial outliers, and the estimators from robust statistics are hard to compute in high dimensions. Are there algorithms for subspace recovery that are both robust to outliers and efficient? We give an algorithm that finds $T$ when it contains more than a $d/n$ fraction of the points. Hence, for, say, $d = n/2$, this estimator is both easy to compute and well-behaved when there is a constant fraction of outliers. We prove that it is Small Set Expansion-hard to find $T$ when the fraction of errors is any larger, thus giving evidence that our estimator is an optimal compromise between efficiency and robustness. As it turns out, this basic problem has a surprising number of connections to other areas, including small set expansion, matroid theory, and functional analysis, that we make use of here.
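
    A small, purely illustrative sketch of the $d/n$ inlier-fraction threshold discussed above, assuming numpy and using random rather than adversarial outliers; this is not the paper's estimator, only a way to see when an instance sits above the threshold:

```python
import numpy as np

def inlier_fraction(points: np.ndarray, basis: np.ndarray, tol: float = 1e-8) -> float:
    """Fraction of points lying (numerically) inside span(basis).

    points: (m, n) array; basis: (n, d) array with orthonormal columns.
    """
    residual = points - points @ basis @ basis.T
    return float(np.mean(np.linalg.norm(residual, axis=1) <= tol))

rng = np.random.default_rng(1)
n, d = 10, 5
basis = np.linalg.qr(rng.normal(size=(n, d)))[0]   # orthonormal basis of a subspace T
inliers = rng.normal(size=(120, d)) @ basis.T      # points inside T
outliers = rng.normal(size=(80, n))                # random (not adversarial) outliers
X = np.vstack([inliers, outliers])
# The abstract's guarantee applies above the d/n threshold: here 120/200 = 0.6 > 0.5.
print(inlier_fraction(X, basis), d / n)
```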