441 research outputs found
Projection methods in conic optimization
There exist efficient algorithms to project a point onto the intersection of
a convex cone and an affine subspace. These conic projections are, in turn, the
workhorse of a range of algorithms in conic optimization, with a variety of
applications in science, finance, and engineering. This chapter reviews some of
these algorithms, emphasizing the so-called regularization algorithms for
linear conic optimization, as well as applications in polynomial optimization.
It presents material from several recent research articles; our aim here is to
clarify the ideas, present them in a general framework, and point out important
techniques.
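As a concrete illustration of such a conic projection, the following is a minimal sketch of Dykstra's alternating-projection method for the intersection of the PSD cone with the affine subspace of unit-diagonal matrices (the nearest-correlation-matrix problem). The instance, iteration count, and tolerances are our own choices, not taken from the chapter:

```python
# Sketch: project a symmetric matrix onto (PSD cone) ∩ {X : diag(X) = 1}
# via Dykstra's alternating projections, with the correction term applied
# on the cone step. Illustrative only; the instance below is our own.
import numpy as np

def proj_psd(A):
    """Euclidean projection onto the PSD cone: clip negative eigenvalues."""
    w, V = np.linalg.eigh((A + A.T) / 2)
    return (V * np.clip(w, 0, None)) @ V.T

def proj_unit_diag(A):
    """Projection onto the affine subspace {X : diag(X) = 1}."""
    B = A.copy()
    np.fill_diagonal(B, 1.0)
    return B

def dykstra_nearest_correlation(A, iters=200):
    X, dS = A.copy(), np.zeros_like(A)
    for _ in range(iters):
        Y = proj_psd(X + dS)   # cone projection of the corrected iterate
        dS = X + dS - Y        # Dykstra correction for the cone step
        X = proj_unit_diag(Y)  # affine projection (no correction needed)
    return X

A = np.array([[1.0,  0.9, -0.9],
              [0.9,  1.0,  0.3],
              [-0.9, 0.3,  1.0]])   # symmetric but indefinite
X = dykstra_nearest_correlation(A)
print(np.diag(X))                   # unit diagonal, by construction
print(np.linalg.eigvalsh(X).min())  # near zero from above: X is (nearly) PSD
```

The affine set needs no Dykstra correction term because it is a translate of a subspace; only the cone projection accumulates one.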
Conic Optimization Theory: Convexification Techniques and Numerical Algorithms
Optimization is at the core of control theory and appears in several areas of
this field, such as optimal control, distributed control, system
identification, robust control, state estimation, model predictive control and
dynamic programming. Recent advances across many topics in modern
optimization have also been reshaping the area of machine learning. Motivated
by the crucial role of optimization theory in the design, analysis, control and
operation of real-world systems, this tutorial paper offers a detailed overview
of some major advances in this area, namely conic optimization and its emerging
applications. First, we discuss the importance of conic optimization in
different areas. Then, we explain seminal results on the design of hierarchies
of convex relaxations for a wide range of nonconvex problems. Finally, we study
different numerical algorithms for large-scale conic optimization problems.
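A small illustration of the Gram-matrix idea underlying such sum-of-squares relaxation hierarchies: a polynomial p is certified nonnegative if p(x) = m(x)^T Q m(x) for a PSD matrix Q over a monomial basis m(x). The polynomial and the candidate Gram matrix below are our own toy example, not from the paper:

```python
# Sketch of a sum-of-squares certificate check. We verify that
# p(x) = x^4 + 2x^2 + 1 equals m(x)^T Q m(x) with m(x) = [1, x, x^2]
# and that Q is PSD, so p is a sum of squares (hence nonnegative).
import numpy as np

Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# Check 1: Q is PSD, so m(x)^T Q m(x) is a sum of squares.
assert np.linalg.eigvalsh(Q).min() >= -1e-12

# Check 2: m(x)^T Q m(x) reproduces p's coefficients. Since basis entry i
# has degree i, the coefficient of x^k is the sum of Q[i, j] over i + j == k.
coeffs = [float(sum(Q[i, j] for i in range(3) for j in range(3) if i + j == k))
          for k in range(5)]
print(coeffs)  # [1.0, 0.0, 2.0, 0.0, 1.0]  -> 1 + 2x^2 + x^4
```

In an actual relaxation, Q is not given but found by a semidefinite solver; higher levels of the hierarchy enlarge the monomial basis.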
Chordal Decomposition in Rank Minimized Semidefinite Programs with Applications to Subspace Clustering
Semidefinite programs (SDPs) often arise in relaxations of some NP-hard
problems, and if the solution of the SDP obeys certain rank constraints, the
relaxation will be tight. Decomposition methods based on chordal sparsity have
already been applied to speed up the solution of sparse SDPs, but methods for
dealing with rank constraints are underdeveloped. This paper leverages a
minimum rank completion result to decompose the rank constraint on a single
large matrix into multiple rank constraints on a set of smaller matrices. The
re-weighted heuristic is used as a proxy for rank, and the specific form of the
heuristic preserves the sparsity pattern between iterations. Implementations of
rank-minimized SDPs through interior-point and first-order algorithms are
discussed. The problem of subspace clustering is used to demonstrate the
computational improvement of the proposed method.
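Reweighted heuristics of this kind are commonly realized as a log-det surrogate for rank. A minimal sketch, assuming the standard weight update W = (X + delta*I)^{-1} for PSD X (the parameter delta and the test matrix are our own illustration, not the paper's implementation):

```python
# Sketch of the reweighted-trace (log-det) surrogate for rank on PSD
# matrices: with W = (X + delta*I)^-1, tr(W X) = sum_i lam_i/(lam_i+delta),
# which approaches rank(X) as delta -> 0. In the reweighted scheme, one
# alternates solving an SDP with objective tr(W X) and updating W.
import numpy as np

def reweighted_trace(X, delta=1e-6):
    W = np.linalg.inv(X + delta * np.eye(X.shape[0]))  # reweighting matrix
    return float(np.trace(W @ X))

rng = np.random.default_rng(0)
V = rng.standard_normal((6, 2))
X = V @ V.T                  # PSD matrix of rank 2
print(round(reweighted_trace(X)))  # 2 -> the surrogate recovers the rank
```

Because tr(W X) is linear in X for fixed W, each reweighted subproblem stays a tractable SDP, which is what makes the heuristic usable as a rank proxy.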