Improving Efficiency and Scalability of Sum of Squares Optimization: Recent Advances and Limitations
It is well-known that any sum of squares (SOS) program can be cast as a
semidefinite program (SDP) of a particular structure and that therein lies the
computational bottleneck for SOS programs, as the SDPs generated by this
procedure are large and costly to solve when the polynomials involved in the
SOS programs have a large number of variables and degree. In this paper, we
review SOS optimization techniques and present two new methods for improving
their computational efficiency. The first method leverages the sparsity of the
underlying SDP to obtain computational speed-ups. Further improvements can be
obtained if the coefficients of the polynomials that describe the problem have
a particular sparsity pattern, called chordal sparsity. The second method
bypasses semidefinite programming altogether and relies instead on solving a
sequence of more tractable convex programs, namely linear and second-order cone
programs. This raises the question of how well one can approximate the
cone of SOS polynomials by second-order representable cones. In the last part
of the paper, we present some recent negative results related to this question.
Comment: Tutorial for CDC 201
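As a concrete instance of the SOS-to-SDP reduction described in the abstract above, the following minimal sketch (assuming NumPy; the polynomial and Gram matrix are chosen for illustration, not taken from the paper) certifies that p(x) = x^4 + 4x^3 + 6x^2 + 4x + 1 = (x^2 + 2x + 1)^2 is a sum of squares by exhibiting a positive semidefinite Gram matrix Q with p(x) = z(x)^T Q z(x) for z(x) = (1, x, x^2); in a genuine SOS program, an SDP solver would search for such a Q. The sketch also checks diagonal dominance of Q, an LP-checkable sufficient condition for positive semidefiniteness of the kind behind the linear-programming alternatives mentioned at the end of the abstract; it fails here, illustrating the approximation gap that the paper's negative results concern.

```python
import numpy as np

# Monomial basis z(x) = (1, x, x^2).  Any symmetric Q with
# p(x) = z(x)^T Q z(x) is a "Gram matrix" of p; p is SOS iff some Gram
# matrix is positive semidefinite.  An SDP solver would search for Q;
# here we just exhibit one: Q = v v^T with v = (1, 2, 1).
Q = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 2.0],
              [1.0, 2.0, 1.0]])

# Coefficient of x^k in z^T Q z is the k-th antidiagonal sum of Q.
coeffs = [float(sum(Q[i, k - i] for i in range(3) if 0 <= k - i < 3))
          for k in range(5)]
print(coeffs)                                  # [1.0, 4.0, 6.0, 4.0, 1.0]

# PSD certificate: the symmetric matrix Q has no negative eigenvalues.
print(np.linalg.eigvalsh(Q).min() >= -1e-9)    # True

# Diagonal dominance is an LP-checkable *sufficient* condition for
# PSD-ness.  It fails in row 0 (1 < 2 + 1), and since every Gram matrix
# of p has Q[0, 0] = 1 and Q[0, 1] = 2, no Gram matrix of p is diagonally
# dominant: the LP-based inner approximation of the SOS cone misses p.
dd = all(Q[i, i] >= sum(abs(Q[i, j]) for j in range(3) if j != i)
         for i in range(3))
print(dd)                                      # False
```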
Conic Optimization Theory: Convexification Techniques and Numerical Algorithms
Optimization is at the core of control theory and appears in several areas of
this field, such as optimal control, distributed control, system
identification, robust control, state estimation, model predictive control and
dynamic programming. The recent advances in various topics of modern
optimization have also been revamping the area of machine learning. Motivated
by the crucial role of optimization theory in the design, analysis, control and
operation of real-world systems, this tutorial paper offers a detailed overview
of some major advances in this area, namely conic optimization and its emerging
applications. First, we discuss the importance of conic optimization in
different areas. Then, we explain seminal results on the design of hierarchies
of convex relaxations for a wide range of nonconvex problems. Finally, we study
different numerical algorithms for large-scale conic optimization problems.
Comment: 18 page
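To make the cone structure behind conic optimization concrete, here is a minimal sketch (assuming NumPy; the function names are illustrative, not from the paper) of membership tests for the three cones that organize the field: the nonnegative orthant (linear programming), the second-order cone (SOCP), and the positive semidefinite cone (SDP). Each class generalizes the previous one in modeling power, which is why SDP relaxations are the most expressive and also the most expensive.

```python
import numpy as np

def in_orthant(x):
    # Linear-programming cone: x >= 0 componentwise.
    return np.all(np.asarray(x) >= 0)

def in_soc(t, x):
    # Second-order (Lorentz) cone: ||x||_2 <= t.
    return np.linalg.norm(x) <= t

def in_psd(X):
    # Semidefinite cone: symmetric X with no negative eigenvalues.
    X = np.asarray(X, dtype=float)
    return np.allclose(X, X.T) and np.linalg.eigvalsh(X).min() >= -1e-9

print(in_orthant([1.0, 0.0, 3.0]))        # True
print(in_soc(5.0, [3.0, 4.0]))            # True: ||(3, 4)|| = 5 <= 5
print(in_psd([[2.0, 1.0], [1.0, 2.0]]))   # True: eigenvalues 1 and 3
print(in_psd([[1.0, 2.0], [2.0, 1.0]]))   # False: eigenvalue -1
```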
Semidefinite representation for convex hulls of real algebraic curves
We show that the closed convex hull of any one-dimensional semi-algebraic
subset of R^n has a semidefinite representation, meaning that it can be written
as a linear projection of the solution set of some linear matrix inequality.
This is proved by an application of the moment relaxation method. Given a
nonsingular affine real algebraic curve C and a compact semialgebraic subset K
of its R-points, the preordering P(K) of all regular functions on C that are
nonnegative on K is known to be finitely generated. We prove that P(K) is
stable, meaning that uniform degree bounds exist for weighted sum of squares
representations of elements of P(K). We also extend this last result to the
case where K is only virtually compact. The main technical tool for the proof
of stability is the archimedean local-global principle. As a consequence of our
results we prove that every convex semialgebraic subset of R^2 has a
semidefinite representation.
Comment: v2: 19 pp (Section 6 is new); v3: 19 pp (small issues fixed); v4: updated and slightly expande
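As a concrete instance of a semidefinite representation (a standard textbook example, not taken from the paper): the unit disk {(x, y) : x^2 + y^2 <= 1} in R^2 is itself the feasible set of a linear matrix inequality, since the symmetric matrix A(x, y) = [[1 + x, y], [y, 1 - x]] has trace 2 and determinant 1 - x^2 - y^2, so it is PSD exactly when (x, y) lies in the disk. The sketch below (assuming NumPy) checks this equivalence numerically on random points; more complicated convex semialgebraic sets, as in the paper's main result, additionally require a linear projection of the LMI feasible set.

```python
import numpy as np

def in_lmi(x, y):
    # Spectrahedral description: A(x, y) = [[1 + x, y], [y, 1 - x]] is PSD.
    A = np.array([[1 + x, y], [y, 1 - x]])
    return np.linalg.eigvalsh(A).min() >= -1e-9

def in_disk(x, y):
    return x * x + y * y <= 1 + 1e-9

# A(x, y) has trace 2 and determinant 1 - x^2 - y^2, so it is PSD iff
# x^2 + y^2 <= 1: the LMI feasible set is exactly the unit disk.
rng = np.random.default_rng(0)
pts = rng.uniform(-2, 2, size=(1000, 2))
ok = all(in_lmi(x, y) == in_disk(x, y) for x, y in pts)
print(ok)   # True
```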