A Newton-bracketing method for a simple conic optimization problem
For the Lagrangian-DNN relaxation of quadratic optimization problems (QOPs),
we propose a Newton-bracketing method to improve the performance of the
bisection-projection method implemented in BBCPOP [to appear in ACM Tran.
Softw., 2019]. The relaxation problem is converted into the problem of finding
the largest zero y* of a continuously differentiable (except at y*)
convex function g : R -> R such that g(y) = 0 if y <= y*
and g(y) > 0 otherwise. In theory, the method generates lower
and upper bounds of y*, both converging to y*. Their convergence is
quadratic if the right derivative of g at y* is positive. Accurate
computation of g'(y) is necessary for the robustness of the method, but it is
difficult to achieve in practice. As an alternative, we present a
secant-bracketing method. We demonstrate that the method improves the quality
of the lower bounds obtained by BBCPOP and SDPNAL+ for binary QOP instances
from BIQMAC. Moreover, new lower bounds for the unknown optimal values of
large-scale QAP instances from QAPLIB are reported. Comment: 19 pages, 2 figures
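The bracketing idea above can be sketched in a few lines. The following is a hypothetical toy implementation, not the BBCPOP code: it assumes a convex g with g(y) = 0 for y <= y* and g(y) > 0 otherwise, and falls back to bisection whenever a Newton step makes no progress on the bracket.

```python
def newton_bracketing(g, dg, lb, ub, tol=1e-10, max_iter=200):
    """Bracket the largest zero y* of a convex function g with
    g(y) == 0 for y <= y* and g(y) > 0 for y > y*.
    Invariant: g(lb) == 0 (so lb <= y*) and g(ub) > 0 (so ub > y*)."""
    for _ in range(max_iter):
        if ub - lb < tol:
            break
        y = ub - g(ub) / dg(ub)   # Newton step from the right;
                                  # convexity guarantees y >= y*
        if y <= lb:               # no progress: fall back to bisection
            y = 0.5 * (lb + ub)
        if g(y) > 0:
            ub = y                # still right of y*: tighter upper bound
        else:
            lb = y                # on the flat part: tighter lower bound
    return lb, ub

# Toy instance with y* = 2: g vanishes left of 2 and grows linearly after.
lb, ub = newton_bracketing(lambda y: max(0.0, y - 2.0),
                           lambda y: 1.0, 0.0, 10.0)
```

In the paper the function values come from the Lagrangian-DNN relaxation and accurate derivatives are the hard part; the secant variant replaces dg with a finite-difference slope between two evaluation points.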
Projection methods in conic optimization
There exist efficient algorithms to project a point onto the intersection of
a convex cone and an affine subspace. Those conic projections are in turn the
work-horse of a range of algorithms in conic optimization, having a variety of
applications in science, finance and engineering. This chapter reviews some of
these algorithms, emphasizing the so-called regularization algorithms for
linear conic optimization, and applications in polynomial optimization. This is
a presentation of the material of several recent research articles; we aim here
at clarifying the ideas, presenting them in a general framework, and pointing
out important techniques.
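As a concrete toy instance of such a conic projection, the sketch below alternately projects onto the PSD cone and onto the affine subspace of unit-diagonal matrices (the feasibility version of the nearest-correlation-matrix problem). This is plain von Neumann alternating projection, which finds a point in the intersection rather than the nearest point; Dykstra's correction would be needed for the true projection.

```python
import numpy as np

def proj_psd(X):
    # Euclidean projection onto the PSD cone: zero out negative eigenvalues.
    w, V = np.linalg.eigh((X + X.T) / 2.0)
    return (V * np.clip(w, 0.0, None)) @ V.T

def proj_unit_diag(X):
    # Projection onto the affine subspace {X : diag(X) = 1}.
    Y = X.copy()
    np.fill_diagonal(Y, 1.0)
    return Y

def find_in_intersection(X, iters=1000):
    # Alternating projections converge to a point in
    # (PSD cone) ∩ {diag = 1} when the intersection is nonempty.
    for _ in range(iters):
        X = proj_unit_diag(proj_psd(X))
    return X

# An indefinite "correlation-like" matrix: x~y and x~z strongly
# positive while y~z is strongly negative, which no PSD unit-diagonal
# matrix can reproduce exactly.
A = np.array([[1.0,  0.9,  0.9],
              [0.9,  1.0, -0.9],
              [0.9, -0.9,  1.0]])
C = find_in_intersection(A)
```

The eigenvalue clipping in proj_psd is exactly the kind of cheap conic projection the chapter builds on; the regularization algorithms it reviews wrap such projections inside proximal or augmented-Lagrangian outer loops.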
A second order cone formulation of continuous CTA model
The final publication is available at link.springer.com. In this paper we consider a minimum-distance Controlled Tabular Adjustment (CTA) model for statistical disclosure limitation (control) of tabular data. The goal of the CTA model is to find the closest safe table to some original tabular data set that contains sensitive information. Closeness is usually measured using the l1 or l2 norm, each measure having its advantages and disadvantages. Recently, in [4] a regularization of the l1-CTA using the Pseudo-Huber function was introduced in an attempt to combine positive characteristics of both l1-CTA and l2-CTA. All three models can be solved using appropriate versions of Interior-Point Methods (IPM). It is known that IPM in general works better on well-structured problems such as conic optimization problems; thus, reformulation of these CTA models as conic optimization problems may be advantageous. We present reformulations of Pseudo-Huber-CTA and l1-CTA as Second-Order Cone (SOC) optimization problems and test the validity of the approach on a small example of a two-dimensional tabular data set.
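The reformulation rests on a standard representability fact, stated here generically rather than in the paper's exact notation: the Pseudo-Huber function phi(z) = delta^2 * (sqrt(1 + (z/delta)^2) - 1) satisfies phi(z) <= s exactly when sqrt(delta^2 + z^2) <= delta + s/delta, which is a second-order cone constraint on the vector (delta + s/delta, delta, z); likewise |z| <= t is the two-dimensional cone constraint ||z|| <= t. The snippet below checks the epigraph equivalence numerically.

```python
import math

def in_soc(t, u):
    # (t, u1, ..., un) lies in the second-order cone iff ||u||_2 <= t.
    return math.hypot(*u) <= t + 1e-12

def pseudo_huber(z, delta):
    # Smooth near zero like the l2 norm, asymptotically linear like l1.
    return delta**2 * (math.sqrt(1.0 + (z / delta)**2) - 1.0)

delta = 1.0
ok = True
for z in (-3.0, -0.5, 0.0, 0.7, 2.5):
    s = pseudo_huber(z, delta)  # tight epigraph value for this z
    # On the cone boundary at s = phi(z) ...
    ok &= in_soc(delta + s / delta, (delta, z))
    # ... and infeasible just below it.
    ok &= not in_soc(delta + (s - 1e-6) / delta, (delta, z))
```

Once every Pseudo-Huber (or absolute-value) term in the objective is replaced by an epigraph variable under such a cone constraint, the whole CTA model becomes a linear objective over SOC constraints, the form conic IPM solvers expect.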
Conic Optimization Theory: Convexification Techniques and Numerical Algorithms
Optimization is at the core of control theory and appears in several areas of
this field, such as optimal control, distributed control, system
identification, robust control, state estimation, model predictive control and
dynamic programming. The recent advances in various topics of modern
optimization have also been revamping the area of machine learning. Motivated
by the crucial role of optimization theory in the design, analysis, control and
operation of real-world systems, this tutorial paper offers a detailed overview
of some major advances in this area, namely conic optimization and its emerging
applications. First, we discuss the importance of conic optimization in
different areas. Then, we explain seminal results on the design of hierarchies
of convex relaxations for a wide range of nonconvex problems. Finally, we study
different numerical algorithms for large-scale conic optimization problems. Comment: 18 pages
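To make the relaxation idea concrete without invoking an SDP solver, consider a toy instance: when minimizing x'Qx over x in {-1, 1}^n, every feasible x has ||x||^2 = n, so n times the smallest eigenvalue of Q is a valid lower bound. This spectral bound is weaker than the first level of the SDP hierarchy (minimize <Q, X> subject to diag(X) = 1, X PSD), but it illustrates how a convex computation certifies a bound on a nonconvex problem.

```python
import itertools
import numpy as np

def exact_min(Q):
    # Brute force over all sign vectors; only viable for tiny n.
    n = Q.shape[0]
    return min(float(np.array(x) @ Q @ np.array(x))
               for x in itertools.product((-1.0, 1.0), repeat=n))

def spectral_bound(Q):
    # ||x||^2 = n on the hypercube vertices, so x'Qx >= n * lambda_min(Q).
    return Q.shape[0] * float(np.linalg.eigvalsh(Q)[0])

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
Q = (M + M.T) / 2.0           # random symmetric test matrix
lower, exact = spectral_bound(Q), exact_min(Q)
```

Higher levels of the hierarchy tighten such bounds further at the cost of larger conic programs, which is what motivates the large-scale algorithms surveyed in the paper.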