
    Approximability and proof complexity

    This work is concerned with the proof complexity of certifying that optimization problems do not have good solutions. Specifically, we consider bounded-degree "Sum of Squares" (SOS) proofs, a powerful algebraic proof system introduced in 1999 by Grigoriev and Vorobjov. Work of Shor, Lasserre, and Parrilo shows that this proof system is automatizable using semidefinite programming (SDP), meaning that any $n$-variable degree-$d$ proof can be found in time $n^{O(d)}$. Furthermore, the SDP is dual to the well-known Lasserre SDP hierarchy, meaning that the "$d/2$-round Lasserre value" of an optimization problem is equal to the best bound provable using a degree-$d$ SOS proof. These ideas were exploited in a recent paper by Barak et al. (STOC 2012) which shows that the known "hard instances" for the Unique-Games problem are in fact solved close to optimally by a constant level of the Lasserre SDP hierarchy. We continue the study of the power of SOS proofs in the context of difficult optimization problems. In particular, we show that the Balanced-Separator integrality gap instances proposed by Devanur et al. can have their optimal value certified by a degree-4 SOS proof. The key ingredient is an SOS proof of the KKL Theorem. We also investigate the extent to which the Khot–Vishnoi Max-Cut integrality gap instances can have their optimum value certified by an SOS proof. We show they can be certified to within a factor 0.952 (> 0.878) using a constant-degree proof. These investigations also raise an interesting mathematical question: is there a constant-degree SOS proof of the Central Limit Theorem? Comment: 34 pages.
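    As a rough illustration of how a low-degree SOS bound is found by solving an SDP, the sketch below searches for the largest gamma such that p(x) - gamma is a sum of squares, for a toy univariate polynomial. It is a minimal sketch only, assuming the cvxpy package is available; the polynomial, the monomial basis, and the coefficient-matching constraints are illustrative choices, not anything from the paper.

        import cvxpy as cp

        # Certify a lower bound for p(x) = x^4 - 4x^2 + 4 via a degree-4 SOS proof:
        # find the largest gamma with p(x) - gamma = [1, x, x^2] Q [1, x, x^2]^T, Q PSD.
        gamma = cp.Variable()
        Q = cp.Variable((3, 3), PSD=True)  # Gram matrix over the monomials 1, x, x^2

        constraints = [
            Q[0, 0] == 4 - gamma,         # constant coefficient
            2 * Q[0, 1] == 0,             # x coefficient
            2 * Q[0, 2] + Q[1, 1] == -4,  # x^2 coefficient
            2 * Q[1, 2] == 0,             # x^3 coefficient
            Q[2, 2] == 1,                 # x^4 coefficient
        ]
        cp.Problem(cp.Maximize(gamma), constraints).solve()
        print(gamma.value)  # ~0.0, matching min p = (x^2 - 2)^2 >= 0

    In the general degree-$d$ case the Gram matrix is indexed by all monomials of degree at most $d/2$, which is where the $n^{O(d)}$ size of the SDP comes from.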

    Global Cardinality Constraints Make Approximating Some Max-2-CSPs Harder

    Assuming the Unique Games Conjecture, we show that existing approximation algorithms for some Boolean Max-2-CSPs with cardinality constraints are optimal. In particular, we prove that Max-Cut with cardinality constraints is UG-hard to approximate within ≈ 0.858, and that Max-2-Sat with cardinality constraints is UG-hard to approximate within ≈ 0.929. In both cases, the previous best hardness results were the same as the hardness of the corresponding unconstrained Max-2-CSP (≈ 0.878 for Max-Cut, and ≈ 0.940 for Max-2-Sat). The hardness for Max-2-Sat applies to monotone Max-2-Sat instances, meaning that we also obtain tight inapproximability for the Max-k-Vertex-Cover problem.
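    For concreteness, the following brute-force sketch spells out what the cardinality constraint changes in the Max-Cut objective: the cut (S, V \ S) must use a set S of a prescribed size. The graph and function name are purely illustrative and unrelated to the paper's reductions.

        from itertools import combinations

        def max_cut_with_cardinality(n, edges, k):
            """Best cut (S, V \\ S) over all S with |S| = k exactly (brute force)."""
            best = 0
            for S in combinations(range(n), k):
                S = set(S)
                cut = sum(1 for u, v in edges if (u in S) != (v in S))
                best = max(best, cut)
            return best

        # 4-cycle: the unconstrained Max-Cut is 4, but forcing |S| = 1 only allows 2.
        edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
        print(max_cut_with_cardinality(4, edges, k=1))  # 2
        print(max_cut_with_cardinality(4, edges, k=2))  # 4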

    Tight Size-Degree Bounds for Sums-of-Squares Proofs

    We exhibit families of 4-CNF formulas over $n$ variables that have sums-of-squares (SOS) proofs of unsatisfiability of degree (a.k.a. rank) $d$ but require SOS proofs of size $n^{\Omega(d)}$ for values of $d = d(n)$ from constant all the way up to $n^{\delta}$ for some universal constant $\delta$. This shows that the $n^{O(d)}$ running time obtained by using the Lasserre semidefinite programming relaxations to find degree-$d$ SOS proofs is optimal up to constant factors in the exponent. We establish this result by combining $\mathsf{NP}$-reductions expressible as low-degree SOS derivations with the idea of relativizing CNF formulas in [Krajíček '04] and [Dantchev and Riis '03], and then applying a restriction argument as in [Atserias, Müller, and Oliva '13] and [Atserias, Lauria, and Nordström '14]. This yields a generic method of amplifying SOS degree lower bounds to size lower bounds, and also generalizes the approach in [ALN14] to obtain size lower bounds for the proof systems resolution, polynomial calculus, and Sherali-Adams from lower bounds on width, degree, and rank, respectively.

    SOS Is Not Obviously Automatizable, Even Approximately

    Suppose we want to minimize a polynomial p(x) = p(x_1,...,x_n), subject to some polynomial constraints q_1(x),...,q_m(x) ≥ 0, using the Sum-of-Squares (SOS) SDP hierarchy. Assume we are in the "explicitly bounded" ("Archimedean") case where the constraints include x_i^2 ≤ 1 for all 1 ≤ i ≤ n. It is often stated that the degree-d version of the SOS hierarchy can be solved, to high accuracy, in time n^{O(d)}. Indeed, I myself have stated this in several previous works. The point of this note is to state (or remind the reader) that this is not obviously true. The difficulty comes not from the "r" in the Ellipsoid Algorithm, but from the "R"; a priori, we only know an exponential upper bound on the number of bits needed to write down the SOS solution. An explicit example is given of a degree-2 SOS program illustrating the difficulty.
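    A hedged illustration of the kind of blow-up at issue (not the note's actual degree-2 example): a chain of squaring constraints x_{i+1} = x_i^2 starting from x_1 = 1/2 forces x_n = 2^(-2^(n-1)), so merely writing the exact solution in binary already takes exponentially many bits.

        from fractions import Fraction

        # Repeated squaring: after n steps the exact value needs about 2^(n-1) bits.
        x = Fraction(1, 2)
        for n in range(1, 7):
            print(n, x.denominator.bit_length() - 1)  # prints 1, 2, 4, 8, 16, 32
            x = x * x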

    Towards a better approximation for sparsest cut?

    We give a new $(1+\epsilon)$-approximation for the sparsest cut problem on graphs where small sets expand significantly more than the sparsest cut (sets of size $n/r$ expand by a factor $\sqrt{\log n \log r}$ bigger, for some small $r$; this condition holds for many natural graph families). We give two different algorithms. One involves Guruswami-Sinop rounding on the level-$r$ Lasserre relaxation. The other is combinatorial and involves a new notion called Small Set Expander Flows (inspired by the expander flows of ARV), which we show exists in the input graph. Both algorithms run in time $2^{O(r)} \mathrm{poly}(n)$. We also show similar approximation algorithms in graphs with genus $g$ with an analogous local expansion condition. This is the first algorithm we know of that achieves a $(1+\epsilon)$-approximation on such a general family of graphs.
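    To make the quantity being approximated concrete, here is an exhaustive (exponential-time) evaluation of a uniform sparsest-cut objective on a tiny graph. The normalization by min(|S|, |V \ S|) and the example graph are one common illustrative choice only; they are not the paper's algorithms, which are SDP- and flow-based.

        from itertools import combinations

        def sparsest_cut_brute_force(n, edges):
            """min over cuts of |E(S, V \\ S)| / min(|S|, |V \\ S|), by exhaustion."""
            best = float("inf")
            for size in range(1, n // 2 + 1):
                for S in combinations(range(n), size):
                    S = set(S)
                    cut = sum(1 for u, v in edges if (u in S) != (v in S))
                    best = min(best, cut / min(len(S), n - len(S)))
            return best

        # Two triangles joined by one bridge edge: cutting the bridge is sparsest.
        edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3), (2, 3)]
        print(sparsest_cut_brute_force(6, edges))  # 1/3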