Conic Optimization Theory: Convexification Techniques and Numerical Algorithms
Optimization is at the core of control theory and appears in several areas of
this field, such as optimal control, distributed control, system
identification, robust control, state estimation, model predictive control and
dynamic programming. Recent advances in various topics of modern
optimization have also been reshaping the field of machine learning. Motivated
by the crucial role of optimization theory in the design, analysis, control and
operation of real-world systems, this tutorial paper offers a detailed overview
of some major advances in this area, namely conic optimization and its emerging
applications. First, we discuss the importance of conic optimization in
different areas. Then, we explain seminal results on the design of hierarchies
of convex relaxations for a wide range of nonconvex problems. Finally, we study
different numerical algorithms for large-scale conic optimization problems.
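
For readers unfamiliar with the standard form, the following is a minimal
sketch of a conic (here, semidefinite) program, written in Python with the
CVXPY modeling package; the data and the choice of tooling are our own
illustration, not part of the paper.

    # Minimal conic program: minimize <C, X> over the intersection of
    # the PSD cone with an affine subspace (a toy SDP, illustrative data).
    import cvxpy as cp
    import numpy as np

    n = 3
    C = np.diag([1.0, 2.0, 3.0])           # illustrative cost matrix
    X = cp.Variable((n, n), symmetric=True)
    constraints = [X >> 0,                 # conic constraint: X in PSD cone
                   cp.trace(X) == 1]       # affine constraint
    prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
    prob.solve()                           # any SDP-capable solver works
    print(prob.value)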
Improving Efficiency and Scalability of Sum of Squares Optimization: Recent Advances and Limitations
It is well known that any sum of squares (SOS) program can be cast as a
semidefinite program (SDP) of a particular structure and that therein lies the
computational bottleneck for SOS programs, as the SDPs generated by this
procedure are large and costly to solve when the polynomials involved in the
SOS programs have a large number of variables and degree. In this paper, we
review SOS optimization techniques and present two new methods for improving
their computational efficiency. The first method leverages the sparsity of the
underlying SDP to obtain computational speed-ups. Further improvements can be
obtained if the coefficients of the polynomials that describe the problem have
a particular sparsity pattern, called chordal sparsity. The second method
bypasses semidefinite programming altogether and relies instead on solving a
sequence of more tractable convex programs, namely linear and second-order
cone programs. This raises the question of how well one can approximate the
cone of SOS polynomials by second-order representable cones. In the last part
of the paper, we present some recent negative results related to this question.
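
As a concrete instance of the SOS-to-SDP reduction described above, the sketch
below (again in CVXPY, our own choice of tooling) certifies that
p(x) = x^4 + 2x^3 + 3x^2 + 2x + 1 is SOS by finding a positive semidefinite
Gram matrix Q with p(x) = z(x)^T Q z(x) for z(x) = [1, x, x^2]; here
p = (x^2 + x + 1)^2, so a certificate exists.

    # SOS feasibility as an SDP: match coefficients of z(x)^T Q z(x)
    # with those of p(x) = x^4 + 2x^3 + 3x^2 + 2x + 1 and require Q PSD.
    import cvxpy as cp

    Q = cp.Variable((3, 3), symmetric=True)
    constraints = [
        Q >> 0,                       # Gram matrix in the PSD cone
        Q[0, 0] == 1,                 # constant term
        2 * Q[0, 1] == 2,             # coefficient of x
        2 * Q[0, 2] + Q[1, 1] == 3,   # coefficient of x^2
        2 * Q[1, 2] == 2,             # coefficient of x^3
        Q[2, 2] == 1,                 # coefficient of x^4
    ]
    cp.Problem(cp.Minimize(0), constraints).solve()
    print(Q.value)                    # any feasible Q certifies SOS-ness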
Sparse sum-of-squares (SOS) optimization: A bridge between DSOS/SDSOS and SOS optimization for sparse polynomials
Optimization over non-negative polynomials is fundamental for nonlinear
systems analysis and control. We investigate the relation between three
tractable relaxations for optimizing over sparse non-negative polynomials:
sparse sum-of-squares (SSOS) optimization, diagonally dominant sum-of-squares
(DSOS) optimization, and scaled diagonally dominant sum-of-squares (SDSOS)
optimization. We prove that the set of SSOS polynomials, an inner approximation
of the cone of SOS polynomials, strictly contains the spaces of sparse
DSOS/SDSOS polynomials. When applicable, therefore, SSOS optimization is less
conservative than its DSOS/SDSOS counterparts. Numerical results for
large-scale sparse polynomial optimization problems confirm this, and also
show that SSOS optimization can be faster than DSOS/SDSOS methods despite
requiring the solution of semidefinite programs rather than less expensive
linear/second-order cone programs.
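
To make the comparison concrete: DSOS optimization replaces positive
semidefiniteness of the Gram matrix with diagonal dominance, which is
linear-programming-representable. A sketch of that constraint in CVXPY
(our illustration, not the authors' code):

    # DSOS-style restriction: require Q_ii >= sum_{j != i} |Q_ij|, which
    # implies Q is PSD (Gershgorin) using only linear constraints.
    import cvxpy as cp

    n = 3
    Q = cp.Variable((n, n), symmetric=True)
    T = cp.Variable((n, n), nonneg=True)   # T_ij >= |Q_ij| elementwise
    dd = [Q <= T, -Q <= T]
    for i in range(n):
        dd.append(Q[i, i] >= cp.sum(T[i, :]) - T[i, i])
    # 'dd' can replace the SDP constraint Q >> 0 in an SOS program,
    # trading conservatism for an LP-sized problem.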
Certification of Bounds of Non-linear Functions: the Templates Method
The aim of this work is to certify lower bounds for real-valued multivariate
functions, defined by semialgebraic or transcendental expressions. The
certificate must ultimately be formally provable in a proof assistant such as
Coq. Such a tool has a wide range of applications; for instance, Hales'
proof of Kepler's conjecture yields thousands of inequalities. We introduce an
approximation algorithm, which combines ideas of the max-plus basis method (in
optimal control) and of the linear templates method developed by Manna et al.
(in static analysis). The algorithm bounds some of the
constituents of the function by suprema of quadratic forms with a well-chosen
curvature. This leads to semialgebraic optimization problems, solved by
sum-of-squares relaxations. Templates limit the blow-up of these relaxations at
the price of coarsening the approximation. We illustrate the efficiency of our
framework with various examples from the literature and discuss the interface
with Coq.
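
For intuition, one standard quadratic under-estimator of the kind such
templates compose is the following (a sketch, assuming the constituent f is
twice differentiable with f'' >= c on the domain of interest): for sample
points x_1, ..., x_k,

    \[
      f(x) \;\ge\; \max_{1 \le i \le k}
        \Bigl( f(x_i) + f'(x_i)\,(x - x_i) + \tfrac{c}{2}\,(x - x_i)^2 \Bigr),
    \]

and replacing f by this max-plus envelope of quadratics turns a transcendental
subproblem into a semialgebraic one amenable to SOS relaxations.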
Smaller SDP for SOS Decomposition
A popular numerical method for computing SOS (sum of squares) decompositions
of polynomials is to transform the problem into semidefinite
programming (SDP) problems and then solve them with SDP solvers. In this paper,
we focus on reducing the sizes of inputs to SDP solvers to improve the
efficiency and reliability of those SDP based methods. Two types of
polynomials, convex cover polynomials and split polynomials, are defined. A
convex cover polynomial or a split polynomial can be decomposed into several
smaller sub-polynomials such that the original polynomial is SOS if and only if
the sub-polynomials are all SOS. Thus the original SOS problem can be
decomposed equivalently into smaller sub-problems. It is proved that convex
cover polynomials are split polynomials, and sparse polynomials with many
variables are often split polynomials, a property that can be detected
efficiently in practice. Some necessary conditions for a polynomial to be SOS
are also given; these can quickly rule out polynomials that have no SOS
representation, so that SDP solvers need not be called at all. All the new
results lead to a new SDP-based method for computing SOS decompositions, which
improves on methods of this kind by passing smaller inputs to SDP solvers in
some cases. Experiments show that the number of monomials obtained by our
program is often smaller than that produced by other SDP-based software,
especially for polynomials with many variables and high degrees. Numerical
results on various tests are reported to show the performance of our program.
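
As a much-simplified illustration of the splitting idea (a heuristic in the
same spirit, not the paper's exact definition of split polynomials): if the
monomials of a polynomial fall into groups whose variable supports lie in
different connected components of the variable co-occurrence graph, the SOS
check can be attempted on each group separately. A Python sketch:

    # Group monomials by connected components of the variable
    # co-occurrence graph (simplified heuristic, not the paper's method).
    def split_by_variables(monomials):
        """monomials: list of frozensets of variable names (supports)."""
        parent = {}

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]   # path compression
                v = parent[v]
            return v

        def union(u, v):
            parent[find(u)] = find(v)

        for mono in monomials:
            for v in mono:
                parent.setdefault(v, v)
        for mono in monomials:
            vs = list(mono)
            for u, v in zip(vs, vs[1:]):        # link variables sharing a monomial
                union(u, v)

        groups = {}
        for mono in monomials:
            root = find(next(iter(mono))) if mono else None
            groups.setdefault(root, []).append(mono)
        return list(groups.values())

    # Example: x^4 + x^2*y^2 and z^6 share no variables, so the SOS
    # check can be attempted on each block separately.
    blocks = split_by_variables([frozenset({"x"}), frozenset({"x", "y"}),
                                 frozenset({"z"})])
    print(blocks)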