Conic Optimization Theory: Convexification Techniques and Numerical Algorithms
Optimization is at the core of control theory and appears in several areas of
this field, such as optimal control, distributed control, system
identification, robust control, state estimation, model predictive control and
dynamic programming. The recent advances in various topics of modern
optimization have also been revamping the area of machine learning. Motivated
by the crucial role of optimization theory in the design, analysis, control and
operation of real-world systems, this tutorial paper offers a detailed overview
of some major advances in this area, namely conic optimization and its emerging
applications. First, we discuss the importance of conic optimization in
different areas. Then, we explain seminal results on the design of hierarchies
of convex relaxations for a wide range of nonconvex problems. Finally, we study
different numerical algorithms for large-scale conic optimization problems.
Comment: 18 pages
On Difference-of-SOS and Difference-of-Convex-SOS Decompositions for Polynomials
In this paper, we are interested in developing polynomial decomposition
techniques to reformulate real-valued multivariate polynomials into
difference-of-sums-of-squares (namely, D-SOS) and
difference-of-convex-sums-of-squares (namely, DC-SOS) forms. Firstly, we prove
that the sets of D-SOS and DC-SOS polynomials are vector spaces that coincide
with the set of real-valued polynomials. Moreover, the problems of finding
D-SOS and DC-SOS decompositions are equivalent to semidefinite programs
(SDPs), which can
be solved to any desired precision in polynomial time. Some important algebraic
properties and the relationships among the set of sums-of-squares (SOS)
polynomials, positive semidefinite (PSD) polynomials, convex-sums-of-squares
(CSOS) polynomials, SOS-convex polynomials, D-SOS and DC-SOS polynomials are
discussed. Secondly, we focus on establishing several practical algorithms for
constructing D-SOS and DC-SOS decompositions for any polynomial without
solving any SDP. Using DC-SOS decompositions, we can reformulate polynomial
problems in the realm of difference-of-convex (DC) programming, which can be
handled by efficient DC programming approaches. Some examples illustrate how to
use our methods for constructing D-SOS and DC-SOS decompositions. The
numerical performance of the D-SOS and DC-SOS decomposition algorithms and
their parallelized variants is tested on a synthetic dataset of 1750 randomly
generated polynomials, sparse and dense, of both small and large size. Some
real-world applications in
higher order moment portfolio optimization problems, eigenvalue complementarity
problems, Euclidean distance matrix completion problems, and Boolean polynomial
programs are also presented.
Comment: 47 pages, 19 figures
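As a concrete sanity check on the claim that every real polynomial is D-SOS, the following sketch verifies the standard identity p = (p + 1)^2 / 2 - (p^2 + 1)^2-free decomposition... more precisely, p = s1 - s2 with s1 = (p + 1)^2 / 2 and s2 = (p^2 + 1) / 2, both of which are sums of squares. This is a generic identity, not one of the paper's specific SDP-free algorithms; all function names below are ours.

```python
# Generic identity (not the paper's algorithm): any polynomial p satisfies
#   p = s1 - s2,  s1 = (p + 1)^2 / 2,  s2 = (p^2 + 1) / 2,
# and both s1 and s2 are sums of squares (scaled squares of polynomials),
# so p is D-SOS. We verify the identity for a univariate example using
# coefficient lists (index = degree).

def poly_mul(a, b):
    """Multiply two polynomials given as coefficient lists."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def poly_sub(a, b):
    """Subtract polynomial b from a, padding to a common length."""
    n = max(len(a), len(b))
    a = a + [0.0] * (n - len(a))
    b = b + [0.0] * (n - len(b))
    return [x - y for x, y in zip(a, b)]

def d_sos(p):
    """Return (s1, s2) with p = s1 - s2, s1 = (p+1)^2/2, s2 = (p^2+1)/2."""
    p_plus_1 = [p[0] + 1.0] + list(p[1:])
    s1 = [c / 2.0 for c in poly_mul(p_plus_1, p_plus_1)]
    s2 = [c / 2.0 for c in poly_mul(p, p)]
    s2[0] += 0.5
    return s1, s2

# Example: p(x) = x^3 - 4x + 1 (coefficients in increasing degree)
p = [1.0, -4.0, 0.0, 1.0]
s1, s2 = d_sos(p)
diff = poly_sub(s1, s2)
assert all(abs(c - q) < 1e-12
           for c, q in zip(diff, p + [0.0] * (len(diff) - len(p))))
```

This trivial decomposition already shows D-SOS spans all polynomials; the point of the paper's algorithms is to produce decompositions with better structure than this generic one.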
Exploiting symmetries in SDP-relaxations for polynomial optimization
In this paper we study various approaches for exploiting symmetries in
polynomial optimization problems within the framework of semidefinite
programming relaxations. Our special focus is on constrained problems,
especially when the symmetric group acts on the variables. In particular,
we investigate the concept of block decomposition within the framework of
constrained polynomial optimization problems, show how the degree principle for
the symmetric group can be computationally exploited and also propose some
methods to efficiently compute in the geometric quotient.
Comment: (v3) Minor revision. To appear in Math. of Operations Research
Sum-of-squares proofs and the quest toward optimal algorithms
In order to obtain the best-known guarantees, algorithms are traditionally
tailored to the particular problem we want to solve. Two recent developments,
the Unique Games Conjecture (UGC) and the Sum-of-Squares (SOS) method,
surprisingly suggest that this tailoring is not necessary and that a single
efficient algorithm could achieve the best possible guarantees for a wide
range of different problems.
The Unique Games Conjecture (UGC) is a tantalizing conjecture in
computational complexity, which, if true, will shed light on the complexity of
a great many problems. In particular this conjecture predicts that a single
concrete algorithm provides optimal guarantees among all efficient algorithms
for a large class of computational problems.
The Sum-of-Squares (SOS) method is a general approach for solving systems of
polynomial constraints. This approach is studied in several scientific
disciplines, including real algebraic geometry, proof complexity, control
theory, and mathematical programming, and has found applications in fields as
diverse as quantum information theory, formal verification, game theory and
many others.
We survey some connections that were recently uncovered between the Unique
Games Conjecture and the Sum-of-Squares method. In particular, we discuss new
tools to rigorously bound the running time of the SOS method for obtaining
approximate solutions to hard optimization problems, and how these tools could
enable the sum-of-squares method to provide new guarantees for many problems
of interest, and possibly even to refute the UGC.
Comment: Survey. To appear in the proceedings of ICM 2014
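To make the SOS method concrete in its simplest instance (our own illustrative sketch, not taken from the survey; the function name is ours): certifying nonnegativity of a quadratic form q(x, y) = a x^2 + b xy + c y^2 amounts to checking that its Gram matrix [[a, b/2], [b/2, c]] is positive semidefinite, and a Cholesky-style factorization of that matrix yields an explicit sum-of-squares certificate.

```python
import random

# Sketch of the degree-2 SOS certificate: write q's Gram matrix as L L^T;
# the rows of L give linear forms whose squares sum to q. We handle only
# the generic a > 0 case here.

def sos_certificate(a, b, c):
    """Return ((p1, q1), (p2, q2)) such that
    a*x^2 + b*x*y + c*y^2 = (p1*x + q1*y)^2 + (p2*x + q2*y)^2,
    or None if the Gram matrix is not PSD (no degree-2 SOS certificate)."""
    if a <= 0:
        return None  # this sketch only handles a > 0
    l11 = a ** 0.5
    l21 = (b / 2.0) / l11
    rem = c - l21 * l21  # Schur complement; PSD iff rem >= 0
    if rem < 0:
        return None
    return (l11, l21), (0.0, rem ** 0.5)

# Example: q(x, y) = x^2 + 2xy + 2y^2 = (x + y)^2 + y^2
(p1, q1), (p2, q2) = sos_certificate(1.0, 2.0, 2.0)
for _ in range(100):  # spot-check the certificate at random points
    x, y = random.uniform(-5, 5), random.uniform(-5, 5)
    lhs = x * x + 2 * x * y + 2 * y * y
    rhs = (p1 * x + q1 * y) ** 2 + (p2 * x + q2 * y) ** 2
    assert abs(lhs - rhs) < 1e-9
```

The higher-degree SOS method generalizes exactly this step: the Gram matrix is indexed by monomials rather than variables, and its PSD-ness is decided by a semidefinite program.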
Global optimization of polynomials using gradient tentacles and sums of squares
In this work, we combine the theory of generalized critical values with the
theory of iterated rings of bounded elements (real holomorphy rings).
We consider the problem of computing the global infimum of a real polynomial
in several variables. Every global minimizer lies on the gradient variety. If
the polynomial attains a minimum, the problem is therefore equivalent to
finding its greatest lower bound on the gradient variety. Nie, Demmel and
Sturmfels recently proved a theorem on the existence of sums of squares
certificates for such
lower bounds. Based on these certificates, they find arbitrarily tight
relaxations of the original problem that can be formulated as semidefinite
programs and thus be solved efficiently.
We deal here with the more general case when the polynomial is bounded from
below but does not necessarily attain a minimum. In this case, the method of
Nie, Demmel and Sturmfels might yield completely wrong results. In order to
overcome this problem, we replace the gradient variety by larger semialgebraic
sets which we call gradient tentacles. It now gets substantially harder to
prove the existence of the necessary sums of squares certificates.
Comment: 22 pages
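A classical textbook example (not necessarily the one used in the paper) of a polynomial that is bounded from below but attains no minimum is f(x, y) = (xy - 1)^2 + x^2: its infimum is 0, but f = 0 would force x = 0 and xy = 1 simultaneously. The sketch below checks this numerically along the sequence (1/n, n).

```python
# f is strictly positive everywhere, yet f(1/n, n) = 1/n^2 -> 0, so the
# infimum 0 is approached only "at infinity" and is never attained. This is
# the regime where methods assuming an attained minimum on the gradient
# variety can fail.

def f(x, y):
    return (x * y - 1.0) ** 2 + x ** 2

values = [f(1.0 / n, float(n)) for n in (1, 10, 100, 1000)]
assert all(v > 0 for v in values)              # never exactly zero
assert values == sorted(values, reverse=True)  # decreasing toward the infimum
assert values[-1] < 1e-5                       # f(1/1000, 1000) is about 1e-6
```

On this example the gradient variety misses the infimum entirely, which is precisely the failure the gradient tentacles are designed to repair.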