Sparse sum-of-squares (SOS) optimization: A bridge between DSOS/SDSOS and SOS optimization for sparse polynomials
Optimization over non-negative polynomials is fundamental for nonlinear
systems analysis and control. We investigate the relation between three
tractable relaxations for optimizing over sparse non-negative polynomials:
sparse sum-of-squares (SSOS) optimization, diagonally dominant sum-of-squares
(DSOS) optimization, and scaled diagonally dominant sum-of-squares (SDSOS)
optimization. We prove that the set of SSOS polynomials, an inner approximation
of the cone of SOS polynomials, strictly contains the spaces of sparse
DSOS/SDSOS polynomials. When applicable, therefore, SSOS optimization is less
conservative than its DSOS/SDSOS counterparts. Numerical results for
large-scale sparse polynomial optimization problems demonstrate this fact, and
also that SSOS optimization can be faster than DSOS/SDSOS methods despite
requiring the solution of semidefinite programs instead of less expensive
linear/second-order cone programs.
Comment: 9 pages, 3 figures
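For context (standard definitions, stated here for the reader rather than quoted from the abstract): a polynomial p of degree 2d is SOS if p(x) = z(x)^T Q z(x) for the vector z(x) of monomials of degree at most d and some positive semidefinite Gram matrix Q; it is DSOS if Q can in addition be chosen diagonally dominant, and SDSOS if Q can be chosen scaled diagonally dominant, i.e. Q = DMD with D a positive diagonal matrix and M diagonally dominant. This yields the inclusions
\[
\mathrm{DSOS}_{n,2d} \subseteq \mathrm{SDSOS}_{n,2d} \subseteq \mathrm{SOS}_{n,2d},
\]
with membership in the three cones checkable by linear, second-order cone, and semidefinite programming, respectively.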
Improving Efficiency and Scalability of Sum of Squares Optimization: Recent Advances and Limitations
It is well-known that any sum of squares (SOS) program can be cast as a
semidefinite program (SDP) of a particular structure and that therein lies the
computational bottleneck for SOS programs, as the SDPs generated by this
procedure are large and costly to solve when the polynomials involved in the
SOS programs have many variables and high degree. In this paper, we
review SOS optimization techniques and present two new methods for improving
their computational efficiency. The first method leverages the sparsity of the
underlying SDP to obtain computational speed-ups. Further improvements can be
obtained if the coefficients of the polynomials that describe the problem have
a particular sparsity pattern, called chordal sparsity. The second method
bypasses semidefinite programming altogether and relies instead on solving a
sequence of more tractable convex programs, namely linear and second-order cone
programs. This opens up the question of how well one can approximate the
cone of SOS polynomials by second-order representable cones. In the last part
of the paper, we present some recent negative results related to this question.
Comment: Tutorial for CDC 201
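To make the SOS-to-SDP reduction described above concrete, here is a minimal sketch written with CVXPY (an assumption of this note, not software referenced by the paper): certifying that the bivariate quartic p(x, y) = 2x^4 + 2x^3 y - x^2 y^2 + 5y^4 is SOS amounts to the small SDP feasibility problem below, obtained by matching coefficients of p against z(x, y)^T Q z(x, y) for the monomial vector z = [x^2, xy, y^2].

import cvxpy as cp

# Gram matrix for the monomial basis z = [x^2, x*y, y^2]
Q = cp.Variable((3, 3), PSD=True)
constraints = [
    Q[0, 0] == 2,                  # coefficient of x^4
    2 * Q[0, 1] == 2,              # coefficient of x^3 y
    2 * Q[0, 2] + Q[1, 1] == -1,   # coefficient of x^2 y^2
    2 * Q[1, 2] == 0,              # coefficient of x y^3
    Q[2, 2] == 5,                  # coefficient of y^4
]
problem = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
problem.solve()
print(problem.status)   # "optimal" means a PSD Gram matrix exists, so p is SOS
print(Q.value)

The SDSOS (respectively DSOS) restriction would replace the PSD constraint on Q with scaled diagonal dominance (respectively diagonal dominance), turning the SDP into a second-order cone program (respectively a linear program) at the cost of conservatism.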
Block Factor-width-two Matrices and Their Applications to Semidefinite and Sum-of-squares Optimization
Semidefinite and sum-of-squares (SOS) optimization are fundamental
computational tools in many areas, including linear and nonlinear systems
theory. However, the scale of problems that can be addressed reliably and
efficiently is still limited. In this paper, we introduce a new notion of
block factor-width-two matrices and build a new hierarchy of inner and
outer approximations of the cone of positive semidefinite (PSD) matrices. This
notion is a block extension of the standard factor-width-two matrices, and
allows for an improved inner-approximation of the PSD cone. In the context of
SOS optimization, this leads to a block extension of the scaled diagonally
dominant sum-of-squares (SDSOS) polynomials. By varying the matrix partition,
block factor-width-two matrices offer a tunable trade-off between computational
scalability and solution quality when solving
semidefinite and SOS optimization. Numerical experiments on large-scale
instances confirm our theoretical findings.
Comment: 26 pages, 5 figures. Added a new section on the approximation quality analysis using block factor-width-two matrices. Code is available through https://github.com/zhengy09/SDPf
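As a rough sketch of the construction (the notation below is chosen here and follows the standard definition of factor-width-two matrices, not necessarily the paper's exact statement): given a partition α = {α_1, …, α_p} of {1, …, n}, the inner approximation consists of symmetric matrices that decompose into PSD terms supported on pairs of blocks,
\[
A \;=\; \sum_{1 \le i < j \le p} E_{ij}^{\mathsf T}\, Z_{ij}\, E_{ij}, \qquad Z_{ij} \succeq 0,
\]
where E_{ij} selects the rows indexed by α_i ∪ α_j. With singleton blocks this reduces to factor-width-two (equivalently, scaled diagonally dominant) matrices, and with a single block it is the full PSD cone, so coarsening the partition trades computation for approximation quality.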
New Dependencies of Hierarchies in Polynomial Optimization
We compare four key hierarchies for solving Constrained Polynomial
Optimization Problems (CPOP): Sum of Squares (SOS), Sum of Diagonally Dominant
Polynomials (SDSOS), Sum of Nonnegative Circuits (SONC), and the Sherali-Adams
(SA) hierarchies. We prove a collection of dependencies among these hierarchies
both for general CPOPs and for optimization problems on the Boolean hypercube.
Key results include, for the general case, that the SONC and SOS hierarchies are
polynomially incomparable, while SDSOS is contained in SONC. A direct
consequence is the non-existence of a Putinar-like Positivstellensatz for
SDSOS. On the Boolean hypercube, we show as a main result that Schmüdgen-like
versions of the hierarchies SDSOS*, SONC*, and SA* are polynomially equivalent.
Moreover, we show that SA* is contained in any Schmüdgen-like hierarchy that
provides an O(n) degree bound.
Comment: 26 pages, 4 figures
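Schematically, restating only the relations claimed in this abstract (the symbols below are notation chosen here, not the paper's):
\[
\mathrm{SDSOS} \subseteq \mathrm{SONC}, \qquad \mathrm{SOS} \text{ and } \mathrm{SONC} \text{ polynomially incomparable}, \qquad \mathrm{SDSOS}^{*} \sim_{\mathrm{poly}} \mathrm{SONC}^{*} \sim_{\mathrm{poly}} \mathrm{SA}^{*} \text{ on } \{0,1\}^{n}.
\]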