
    A Tensor Analogy of Yuan's Theorem of the Alternative and Polynomial Optimization with Sign Structure

    Full text link
    Yuan's theorem of the alternative is an important theoretical tool in optimization: it provides a checkable certificate for the infeasibility of a strict inequality system involving two homogeneous quadratic functions. In this paper, we provide a tractable extension of Yuan's theorem of the alternative to the symmetric tensor setting. As an application, we establish that the optimal value of a class of nonconvex polynomial optimization problems with suitable sign structure (more explicitly, with essentially non-positive coefficients) can be computed by a related convex conic programming problem, and that the optimal solution of these nonconvex problems can be recovered from the corresponding solution of the convex conic program. Moreover, we show that this class of nonconvex polynomial optimization problems enjoys an exact sum-of-squares relaxation, and so can be solved via a single semidefinite programming problem. Comment: accepted by Journal of Optimization Theory and Applications, UNSW preprint, 22 pages
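
    The "single semidefinite program" claim above follows the usual pattern of sum-of-squares programming. As a minimal generic sketch (not the paper's specific conic reformulation; the quartic is a made-up example and the cvxpy package is assumed), the following bounds a univariate polynomial from below by solving one SDP:

```python
import cvxpy as cp

# Minimal SOS relaxation sketch: bound p(x) = x^4 - 3x^2 + 1 from below by
# maximizing gamma such that p(x) - gamma is a sum of squares.  With the
# monomial basis m(x) = [1, x, x^2], this means p(x) - gamma = m(x)^T Q m(x)
# for some positive semidefinite Q, i.e. a single SDP.
Q = cp.Variable((3, 3), symmetric=True)
gamma = cp.Variable()

constraints = [
    Q >> 0,                       # Q positive semidefinite
    Q[0, 0] == 1 - gamma,         # constant term
    2 * Q[0, 1] == 0,             # coefficient of x
    2 * Q[0, 2] + Q[1, 1] == -3,  # coefficient of x^2
    2 * Q[1, 2] == 0,             # coefficient of x^3
    Q[2, 2] == 1,                 # coefficient of x^4
]

cp.Problem(cp.Maximize(gamma), constraints).solve()
print("SOS lower bound:", gamma.value)  # about -1.25, the true minimum here
```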

    Exploiting symmetries in SDP-relaxations for polynomial optimization

    Full text link
    In this paper we study various approaches for exploiting symmetries in polynomial optimization problems within the framework of semidefinite programming relaxations. Our special focus is on constrained problems, especially when the symmetric group acts on the variables. In particular, we investigate the concept of block decomposition for constrained polynomial optimization problems, show how the degree principle for the symmetric group can be exploited computationally, and propose some methods to compute efficiently in the geometric quotient. Comment: (v3) Minor revision. To appear in Math. of Operations Research
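
    As a toy illustration of the block-decomposition idea (not the paper's general construction; the 2x2 matrix is a made-up example and only numpy is assumed), a matrix invariant under the S_2 action that swaps two coordinates becomes block-diagonal in a symmetry-adapted basis; in an SDP relaxation this replaces one large PSD constraint by several smaller ones:

```python
import numpy as np

# Symmetry-exploitation sketch: a matrix invariant under swapping the two
# coordinates is block-diagonalized by the symmetry-adapted basis
# {(e1+e2)/sqrt(2), (e1-e2)/sqrt(2)}.
P = np.array([[0.0, 1.0], [1.0, 0.0]])        # swap permutation
A = np.array([[2.0, 0.5], [0.5, 2.0]])        # S_2-invariant data
assert np.allclose(P @ A @ P.T, A)            # invariance check

T = np.column_stack([[1, 1], [1, -1]]) / np.sqrt(2)  # symmetry-adapted basis
print(np.round(T.T @ A @ T, 6))               # diagonal blocks: 2.5 and 1.5
```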

    Polynomial Optimization with Applications to Stability Analysis and Control - Alternatives to Sum of Squares

    Full text link
    In this paper, we explore the merits of various algorithms for polynomial optimization problems, focusing on alternatives to sum of squares programming. While we refer to advantages and disadvantages of Quantifier Elimination, Reformulation-Linearization Techniques, Blossoming and Groebner basis methods, our main focus is on algorithms defined by Polya's theorem, Bernstein's theorem and Handelman's theorem. We first formulate polynomial optimization problems as problems of verifying the feasibility of semi-algebraic sets. Then, we discuss how Polya's algorithm, Bernstein's algorithm and Handelman's algorithm reduce the intractable problem of feasibility of semi-algebraic sets to linear and/or semidefinite programming. We apply these algorithms to different problems in robust stability analysis and stability of nonlinear dynamical systems. As one contribution of this paper, we apply Polya's algorithm to the problem of H_infinity control of systems with parametric uncertainty. Numerical examples are provided to compare the accuracy of these algorithms with other polynomial optimization algorithms in the literature. Comment: AIMS Journal of Discrete and Continuous Dynamical Systems - Series
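
    As a minimal sketch of the idea behind Polya-type algorithms (with a made-up two-variable example and assuming the sympy package): multiply a homogeneous polynomial that is strictly positive on the simplex by increasing powers of (x + y) until all coefficients become nonnegative; coefficient conditions of this kind are what such methods hand off to linear programming:

```python
import sympy as sp

# Polya-style positivity check sketch: for a homogeneous polynomial p that is
# strictly positive on the standard simplex, (x + y)^N * p has only
# nonnegative coefficients once N is large enough.  Here p has a negative
# coefficient (so N = 0 fails), but N = 1 already certifies positivity.
x, y = sp.symbols('x y')
p = x**2 - x*y + y**2

for N in range(6):
    q = sp.expand((x + y)**N * p)
    if all(c >= 0 for c in sp.Poly(q, x, y).coeffs()):
        print(f"Polya certificate at N = {N}: {q}")
        break
```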

    Tight Sum-of-Squares lower bounds for binary polynomial optimization problems

    Get PDF
    We give two results concerning the power of the Sum-of-Squares (SoS)/Lasserre hierarchy. For binary polynomial optimization problems of degree $2d$ and an odd number of variables $n$, we prove that $\frac{n+2d-1}{2}$ levels of the SoS/Lasserre hierarchy are necessary to provide the exact optimal value. This matches the recent upper bound result by Sakaue, Takeda, Kim and Ito. Additionally, we study a conjecture by Laurent, who considered the linear representation of a set with no integral points. She showed that the Sherali-Adams hierarchy requires $n$ levels to detect the empty integer hull, and conjectured that the SoS/Lasserre rank for the same problem is $n-1$. We disprove this conjecture and derive lower and upper bounds for the rank.
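
    For orientation, the first level of the SoS/Lasserre hierarchy for a binary problem already has a compact SDP form. The sketch below (a made-up two-variable instance, assuming the cvxpy package) builds the level-1 moment matrix indexed by [1, x1, x2] and uses x_i^2 = x_i on the diagonal; on this tiny instance level 1 happens to be tight, whereas the lower bound above shows that $\frac{n+2d-1}{2}$ levels can be necessary in general:

```python
import cvxpy as cp
import itertools

# Level-1 moment (SoS/Lasserre) relaxation sketch for a tiny binary problem:
# maximize p(x) = x1 + x2 - 2*x1*x2 over x in {0,1}^2.  The moment matrix M
# is indexed by the monomials [1, x1, x2]; the binary constraints x_i^2 = x_i
# let the diagonal entries reuse the first-order moments.
M = cp.Variable((3, 3), symmetric=True)
constraints = [
    M >> 0,
    M[0, 0] == 1,        # moment of the constant monomial
    M[1, 1] == M[0, 1],  # x1^2 = x1
    M[2, 2] == M[0, 2],  # x2^2 = x2
]
objective = M[0, 1] + M[0, 2] - 2 * M[1, 2]   # E[x1] + E[x2] - 2 E[x1 x2]

prob = cp.Problem(cp.Maximize(objective), constraints)
prob.solve()

exact = max(x1 + x2 - 2 * x1 * x2 for x1, x2 in itertools.product([0, 1], repeat=2))
print("level-1 bound:", round(prob.value, 4), "  exact optimum:", exact)
```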