
    SOS Is Not Obviously Automatizable, Even Approximately

    Suppose we want to minimize a polynomial p(x) = p(x_1,...,x_n), subject to some polynomial constraints q_1(x) >= 0, ..., q_m(x) >= 0, using the Sum-of-Squares (SOS) SDP hierarchy. Assume we are in the "explicitly bounded" ("Archimedean") case where the constraints include x_i^2 <= 1 for all 1 <= i <= n. It is often stated that the degree-d version of the SOS hierarchy can be solved, to high accuracy, in time n^O(d). Indeed, I myself have stated this in several previous works. The point of this note is to state (or remind the reader) that this is not obviously true. The difficulty comes not from the "r" in the Ellipsoid Algorithm, but from the "R"; a priori, we only know an exponential upper bound on the number of bits needed to write down the SOS solution. An explicit example is given of a degree-2 SOS program illustrating the difficulty.
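
    To make the object under discussion concrete, here is a minimal sketch (not taken from the note) of the degree-2 relaxation in its moment form, written with cvxpy and a toy two-variable objective; the library, the solver it invokes, and the example polynomial are illustrative assumptions. It exhibits the SDP whose n^O(d) running-time claim the note examines, and it says nothing about how many bits an optimal solution needs, which is exactly the difficulty being raised.

    import cvxpy as cp
    import numpy as np

    n = 2
    # Toy objective p(x) = x1*x2 + x1, encoded as p(x) = c + b.x + x^T A x
    # with A symmetric (this particular polynomial is an assumption).
    c = 0.0
    b = np.array([1.0, 0.0])
    A = np.array([[0.0, 0.5],
                  [0.5, 0.0]])

    # Degree-2 moment matrix indexed by the monomials (1, x_1, ..., x_n):
    # M[0,0] = 1, M[0,i] plays the role of "E[x_i]" and M[i,j] that of
    # "E[x_i x_j]" under a degree-2 pseudo-distribution.
    M = cp.Variable((n + 1, n + 1), symmetric=True)
    constraints = [M >> 0, M[0, 0] == 1]
    constraints += [M[i + 1, i + 1] <= 1 for i in range(n)]  # from x_i^2 <= 1

    y = M[0, 1:]   # first-order pseudo-moments
    Y = M[1:, 1:]  # second-order pseudo-moments
    objective = cp.Minimize(c + b @ y + cp.trace(A @ Y))

    prob = cp.Problem(objective, constraints)
    prob.solve()   # any SDP-capable solver bundled with cvxpy will do
    print("degree-2 lower bound on min p(x):", prob.value)

    The optimal value is a lower bound on the true minimum of p over the box [-1,1]^n; the degree-d version of the hierarchy replaces M by a moment matrix indexed by all monomials of degree at most d/2, which is where the n^O(d) size comes from.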

    Short Proofs Are Hard to Find

    We obtain a streamlined proof of an important result by Alekhnovich and Razborov [Michael Alekhnovich and Alexander A. Razborov, 2008], showing that it is hard to automatize both tree-like and general Resolution. Under a different assumption than [Michael Alekhnovich and Alexander A. Razborov, 2008], our simplified proof gives improved bounds: we show under ETH that these proof systems are not automatizable in time n^f(n), whenever f(n) = o(log^{1/7 - epsilon} log n) for any epsilon > 0. Previously, non-automatizability was only known for f(n) = O(1). Our proof also extends fairly straightforwardly to prove similar hardness results for PCR and Res(r).

    Sum of squares lower bounds for refuting any CSP

    Let $P:\{0,1\}^k \to \{0,1\}$ be a nontrivial $k$-ary predicate. Consider a random instance of the constraint satisfaction problem $\mathrm{CSP}(P)$ on $n$ variables with $\Delta n$ constraints, each being $P$ applied to $k$ randomly chosen literals. Provided the constraint density satisfies $\Delta \gg 1$, such an instance is unsatisfiable with high probability. The refutation problem is to efficiently find a proof of unsatisfiability. We show that whenever the predicate $P$ supports a $t$-wise uniform probability distribution on its satisfying assignments, the sum of squares (SOS) algorithm of degree $d = \Theta\big(\frac{n}{\Delta^{2/(t-1)} \log \Delta}\big)$ (which runs in time $n^{O(d)}$) cannot refute a random instance of $\mathrm{CSP}(P)$. In particular, the polynomial-time SOS algorithm requires $\widetilde{\Omega}(n^{(t+1)/2})$ constraints to refute random instances of $\mathrm{CSP}(P)$ when $P$ supports a $t$-wise uniform distribution on its satisfying assignments. Together with recent work of Lee et al. [LRS15], our result also implies that any polynomial-size semidefinite programming relaxation for refutation requires at least $\widetilde{\Omega}(n^{(t+1)/2})$ constraints. Our results (which also extend with no change to CSPs over larger alphabets) subsume all previously known lower bounds for semialgebraic refutation of random CSPs. For every constraint predicate $P$, they give a three-way hardness tradeoff between the density of constraints, the SOS degree (hence running time), and the strength of the refutation. By recent algorithmic results of Allen et al. [AOW15] and Raghavendra et al. [RRS16], this full three-way tradeoff is tight, up to lower-order factors.
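
    As a heuristic reading of the tradeoff, using only the quantities in the statement above and ignoring hidden constants: polynomial-time SOS means constant degree, and the theorem leaves constant-degree refutation possible only once its degree threshold drops to $O(1)$, i.e.,

    $$ \frac{n}{\Delta^{2/(t-1)} \log \Delta} \;=\; O(1)
       \;\Longleftrightarrow\;
       \Delta^{2/(t-1)} \log \Delta \;=\; \Omega(n)
       \;\Longrightarrow\;
       \Delta \;=\; \widetilde{\Omega}\big(n^{(t-1)/2}\big), $$

    so the instance must have $\Delta n = \widetilde{\Omega}(n^{(t+1)/2})$ constraints, which is the bound quoted above.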

    New Dependencies of Hierarchies in Polynomial Optimization

    We compare four key hierarchies for solving Constrained Polynomial Optimization Problems (CPOP): the Sum of Squares (SOS), Sum of Diagonally Dominant Polynomials (SDSOS), Sum of Nonnegative Circuits (SONC), and Sherali-Adams (SA) hierarchies. We prove a collection of dependencies among these hierarchies, both for general CPOPs and for optimization problems on the Boolean hypercube. Key results for the general case include that the SONC and SOS hierarchies are polynomially incomparable, while SDSOS is contained in SONC. A direct consequence is the non-existence of a Putinar-like Positivstellensatz for SDSOS. On the Boolean hypercube, we show as a main result that the Schmüdgen-like versions of the hierarchies, SDSOS*, SONC*, and SA*, are polynomially equivalent. Moreover, we show that SA* is contained in any Schmüdgen-like hierarchy that provides an O(n) degree bound.
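
    For orientation, and as standard background rather than a claim of this paper: "Putinar-like" and "Schmüdgen-like" refer to the two classical shapes of certificates of positivity on a set $\{x : q_1(x) \ge 0, \ldots, q_m(x) \ge 0\}$,

    $$ f \;=\; \sigma_0 + \sum_{i=1}^{m} \sigma_i\, q_i
       \qquad\text{versus}\qquad
       f \;=\; \sum_{S \subseteq \{1,\ldots,m\}} \sigma_S \prod_{i \in S} q_i , $$

    where the first uses one multiplier per constraint and the second also allows products of constraints. Classically every $\sigma$ is a sum of squares; roughly speaking, the hierarchies compared here vary which cone (SOS, SDSOS, SONC, SA) the multipliers are drawn from.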

    On the Bit Complexity of Sum-of-Squares Proofs

    It has often been claimed in recent papers that one can find a degree-d Sum-of-Squares proof, if one exists, via the Ellipsoid algorithm. In a recent paper, Ryan O'Donnell notes that this widely quoted claim is not necessarily true. He presents an example of a polynomial system with bounded coefficients that admits low-degree proofs of non-negativity, but these proofs necessarily involve numbers with an exponential number of bits, causing the Ellipsoid algorithm to take exponential time. In this paper we obtain both positive and negative results on the bit complexity of SoS proofs. First, we propose a sufficient condition on a polynomial system that implies a bound on the coefficients in an SoS proof. We demonstrate that this sufficient condition is applicable for common use-cases of the SoS algorithm, such as Max-CSP, Balanced Separator, Max-Clique, Max-Bisection, and Unit-Vector constraints. On the negative side, O'Donnell asked whether every polynomial system containing Boolean constraints admits proofs of polynomial bit complexity. We answer this question in the negative, giving a counterexample system and a non-negative polynomial which has degree-two SoS proofs, but no SoS proof with small coefficients until degree sqrt(n).
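
    For the reader's convenience, the standard shape of the object whose coefficients are at issue (textbook background, not specific to this paper): a degree-d SoS proof that constraints q_1 >= 0, ..., q_m >= 0 and h_1 = 0, ..., h_l = 0 imply p >= 0 is an identity

    $$ p \;=\; \sigma_0 \;+\; \sum_{i=1}^{m} \sigma_i\, q_i \;+\; \sum_{j=1}^{l} \lambda_j\, h_j , $$

    where each $\sigma_i$ is a sum of squares of polynomials, the $\lambda_j$ are arbitrary polynomials, and every summand has degree at most d. The bit complexity of the proof is the number of bits needed to write the coefficients of the $\sigma_i$ and $\lambda_j$; the counterexample in this paper forces these to be large in every proof of degree below sqrt(n), even though degree-two proofs exist.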

    A note on the computational complexity of the moment-SOS hierarchy for polynomial optimization

    The moment-sum-of-squares (moment-SOS) hierarchy is one of the most celebrated and widely applied methods for approximating the minimum of an n-variate polynomial over a feasible region defined by polynomial (in)equalities. A key feature of the hierarchy is that, at a fixed level, it can be formulated as a semidefinite program of size polynomial in the number of variables n. Although this suggests that it may therefore be computed in polynomial time, this is not necessarily the case. Indeed, as O'Donnell (2017) and later Raghavendra & Weitz (2017) show, there exist examples where the SOS representations used in the hierarchy have exponential bit-complexity. We study the computational complexity of the moment-SOS hierarchy, complementing and expanding upon earlier work of Raghavendra & Weitz (2017). In particular, we establish algebraic and geometric conditions under which polynomial-time computation is guaranteed to be possible.
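
    To spell out the "size polynomial in n" statement (a standard count, not a claim specific to this paper): at level d, the moment matrix is indexed by the monomials of degree at most d in n variables, so the semidefinite program has matrix side length

    $$ \binom{n+d}{d} \;=\; O(n^{d}) \qquad \text{for fixed } d, $$

    which is polynomial in n at every fixed level. The examples of O'Donnell (2017) and Raghavendra & Weitz (2017) cited above show that it is instead the bit-length of the numbers appearing in (near-)optimal solutions that can be exponential.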

    Feasible Interpolation for Polynomial Calculus and Sums-Of-Squares

    We prove that both Polynomial Calculus and Sums-of-Squares proof systems admit a strong form of the feasible interpolation property for sets of polynomial equality constraints. Precisely, given two sets P(x,z) and Q(y,z) of equality constraints, a refutation Π of P(x,z) ∪ Q(y,z), and any assignment a to the variables z, one can find a refutation of P(x,a) or a refutation of Q(y,a) in time polynomial in the length of the bit-string encoding the refutation Π. For Sums-of-Squares we rely on the use of Boolean axioms, but for Polynomial Calculus we do not assume their presence.

    Ideal Membership Problem and a Majority Polymorphism over the Ternary Domain
