
    Conic Optimization Theory: Convexification Techniques and Numerical Algorithms

    Optimization is at the core of control theory and appears in several areas of this field, such as optimal control, distributed control, system identification, robust control, state estimation, model predictive control and dynamic programming. Recent advances in various topics of modern optimization have also been reshaping the area of machine learning. Motivated by the crucial role of optimization theory in the design, analysis, control and operation of real-world systems, this tutorial paper offers a detailed overview of some major advances in this area, namely conic optimization and its emerging applications. First, we discuss the importance of conic optimization in different areas. Then, we explain seminal results on the design of hierarchies of convex relaxations for a wide range of nonconvex problems. Finally, we study different numerical algorithms for large-scale conic optimization problems.
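    To make the idea of a convex-relaxation hierarchy concrete, the following sketch (not taken from the paper; it assumes numpy and cvxpy are installed) forms a first-level, Shor-type SDP relaxation of a small nonconvex quadratic problem, min x'Qx subject to x'x = 1, by lifting x to X = xx' and dropping the rank-one constraint:

    # Minimal sketch of a first-level (Shor) SDP relaxation; cvxpy and numpy
    # are assumed to be available, and the problem data are synthetic.
    import numpy as np
    import cvxpy as cp

    n = 4
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((n, n))
    Q = (Q + Q.T) / 2                      # symmetric, generally indefinite

    # Lift x to X = x x' and keep only the convex constraints X >= 0 (PSD)
    # and trace(X) = 1; the nonconvex rank-one condition is dropped.
    X = cp.Variable((n, n), symmetric=True)
    prob = cp.Problem(cp.Minimize(cp.trace(Q @ X)),
                      [X >> 0, cp.trace(X) == 1])
    prob.solve()

    # For this particular problem the relaxation happens to be tight: its
    # optimal value equals the smallest eigenvalue of Q.
    print(prob.value, np.linalg.eigvalsh(Q)[0])

    Higher levels of such hierarchies tighten the relaxation by imposing semidefinite constraints on larger lifted matrices, at the cost of larger conic blocks.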

    Decomposition in conic optimization with partially separable structure

    Decomposition techniques for linear programming are difficult to extend to conic optimization problems with general non-polyhedral convex cones, because the conic inequalities introduce an additional nonlinear coupling between the variables. However, in many applications the convex cones have a partially separable structure that allows them to be characterized in terms of simpler lower-dimensional cones. The most important example is sparse semidefinite programming with a chordal sparsity pattern. Here, partial separability derives from the clique decomposition theorems that characterize positive semidefinite and positive-semidefinite-completable matrices with chordal sparsity patterns. The paper describes a decomposition method that exploits partial separability in conic linear optimization. The method is based on Spingarn's method for equality-constrained convex optimization, combined with a fast interior-point method for evaluating proximal operators.
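    As an illustration of the clique decomposition behind this structure (a minimal sketch, not the paper's method; numpy is assumed), consider the simplest chordal pattern, a tridiagonal one, whose cliques are the index pairs {i, i+1}. A positive semidefinite matrix with this pattern splits into PSD terms each supported on a single clique, here read off from the fill-free Cholesky factor:

    import numpy as np

    n = 5
    # A PSD tridiagonal matrix (diagonally dominant, hence PSD).
    A = (np.diag(4.0 * np.ones(n))
         + np.diag(-1.0 * np.ones(n - 1), 1)
         + np.diag(-1.0 * np.ones(n - 1), -1))

    # Chordal sparsity: the Cholesky factorization incurs no fill-in,
    # so the factor keeps the same banded pattern.
    L = np.linalg.cholesky(A)

    # A = sum_j l_j l_j', where column l_j is nonzero only within the
    # clique {j, j+1}; each rank-one term is PSD and clique-supported.
    terms = [np.outer(L[:, j], L[:, j]) for j in range(n)]
    assert np.allclose(sum(terms), A)
    for j in range(n):
        support = np.flatnonzero(np.abs(L[:, j]) > 1e-12)
        print(f"term {j}: supported on indices {support.tolist()}")

    The same principle, applied to general chordal patterns, lets a large sparse semidefinite cone be expressed through many smaller dense cones indexed by the cliques, which is what the decomposition method exploits.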