Smoothed analysis of the low-rank approach for smooth semidefinite programs
We consider semidefinite programs (SDPs) of size n with equality constraints.
In order to overcome scalability issues, Burer and Monteiro proposed a
factorized approach based on optimizing over a matrix Y of size n-by-k such
that X = YY^T is the SDP variable. The advantages of such a formulation are
twofold: the dimension of the optimization variable is reduced and positive
semidefiniteness is naturally enforced. However, the problem in Y is
non-convex. In prior work, it has been shown that, when the constraints on the
factorized variable regularly define a smooth manifold, provided k is large
enough, for almost all cost matrices, all second-order stationary points
(SOSPs) are optimal. Importantly, in practice, one can only compute points
which approximately satisfy necessary optimality conditions, leading to the
question: are such points also approximately optimal? To this end, and under
similar assumptions, we use smoothed analysis to show that approximate SOSPs
for a randomly perturbed objective function are approximate global optima, with
k scaling like the square root of the number of constraints (up to log
factors). Moreover, we bound the optimality gap at the approximate solution of
the perturbed problem with respect to the original problem. We particularize
our results to an SDP relaxation of phase retrieval.
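The Burer-Monteiro idea described in this abstract can be sketched on a toy
Max-Cut SDP relaxation (an illustrative example under assumed parameter
choices, not the paper's smoothed-analysis setting or code): replace the PSD
variable X by Y Y^T with Y of size n-by-k, and run projected gradient ascent
on Y, renormalizing rows so that diag(Y Y^T) = 1 holds by construction.

```python
import numpy as np

def burer_monteiro_maxcut(C, k, steps=500, lr=0.1, seed=0):
    """Maximize <C, YY^T> over Y in R^{n x k} with unit-norm rows,
    i.e. the Burer-Monteiro factorization of the Max-Cut SDP
    relaxation  max <C, X>  s.t.  diag(X) = 1, X PSD."""
    n = C.shape[0]
    rng = np.random.default_rng(seed)
    Y = rng.standard_normal((n, k))
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)  # diag(YY^T) = 1
    for _ in range(steps):
        G = 2 * C @ Y                              # gradient of <C, YY^T>
        Y += lr * G
        Y /= np.linalg.norm(Y, axis=1, keepdims=True)  # retract to sphere
    return Y

# Toy instance: Laplacian of a 4-cycle as the cost matrix, so the
# objective <C, X> counts (relaxed) cut weight; k = 3 > rank of optimum.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
C = np.diag(A.sum(axis=1)) - A
Y = burer_monteiro_maxcut(C, k=3)
X = Y @ Y.T
print(np.allclose(np.diag(X), 1.0))  # feasibility holds by construction
print(np.trace(C @ X))               # objective value at the stationary point
```

The row renormalization plays the role of a retraction onto the constraint
manifold; this is the sense in which the factorized problem is a smooth
(but non-convex) optimization problem in Y.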
Block Factor-width-two Matrices and Their Applications to Semidefinite and Sum-of-squares Optimization
Semidefinite and sum-of-squares (SOS) optimization are fundamental
computational tools in many areas, including linear and nonlinear systems
theory. However, the scale of problems that can be addressed reliably and
efficiently is still limited. In this paper, we introduce a new notion of
\emph{block factor-width-two matrices} and build a new hierarchy of inner and
outer approximations of the cone of positive semidefinite (PSD) matrices. This
notion is a block extension of the standard factor-width-two matrices, and
allows for an improved inner-approximation of the PSD cone. In the context of
SOS optimization, this leads to a block extension of the \emph{scaled
diagonally dominant sum-of-squares (SDSOS)} polynomials. By varying a matrix
partition, the notion of block factor-width-two matrices can balance a
trade-off between the computation scalability and solution quality for solving
semidefinite and SOS optimization. Numerical experiments on large-scale
instances confirm our theoretical findings.
Comment: 26 pages, 5 figures. Added a new section on the approximation
quality analysis using block factor-width-two matrices. Code is available
through https://github.com/zhengy09/SDPf
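The standard (non-block) factor-width-two cone that this abstract extends can
be illustrated concretely (an assumed, simplified sketch, not the paper's
code): a symmetric matrix has factor-width two if it is a sum of PSD matrices
each supported on a single 2x2 principal submatrix. For a symmetric
diagonally dominant matrix with nonnegative diagonal, such a decomposition
can be written down explicitly, certifying PSD-ness without solving an SDP.

```python
import numpy as np

def factor_width_two_decomposition(M):
    """Split a symmetric diagonally dominant matrix M (with nonnegative
    diagonal) into PSD pieces supported on 2x2 (or 1x1) principal
    submatrices. Returns a list of (i, j, block) triples."""
    n = M.shape[0]
    pieces = []
    residual_diag = np.diag(M).astype(float).copy()
    for i in range(n):
        for j in range(i + 1, n):
            if M[i, j] != 0:
                a = abs(M[i, j])
                # [[a, m], [m, a]] with a = |m| is PSD
                blk = np.array([[a, M[i, j]], [M[i, j], a]], dtype=float)
                pieces.append((i, j, blk))
                residual_diag[i] -= a
                residual_diag[j] -= a
    # Leftover diagonal mass is nonnegative by diagonal dominance.
    for i in range(n):
        if residual_diag[i] > 0:
            pieces.append((i, i, np.array([[residual_diag[i]]])))
    return pieces

def reassemble(pieces, n):
    M = np.zeros((n, n))
    for i, j, blk in pieces:
        idx = [i] if i == j else [i, j]
        M[np.ix_(idx, idx)] += blk
    return M

M = np.array([[ 3.0, 1.0, -1.0],
              [ 1.0, 4.0,  2.0],
              [-1.0, 2.0,  5.0]])       # diagonally dominant
pieces = factor_width_two_decomposition(M)
print(np.allclose(reassemble(pieces, 3), M))   # exact reconstruction
print(all(np.all(np.linalg.eigvalsh(b) >= -1e-12) for _, _, b in pieces))
```

The block extension in the paper replaces scalar entries with matrix blocks
of a chosen partition; coarser partitions move this inner approximation
closer to the full PSD cone at higher computational cost.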
Efficient SDP Inference for Fully-connected CRFs Based on Low-rank Decomposition
Conditional Random Fields (CRF) have been widely used in a variety of
computer vision tasks. Conventional CRFs typically define edges on neighboring
image pixels, resulting in a sparse graph such that efficient inference can be
performed. However, these CRFs fail to model long-range contextual
relationships. Fully-connected CRFs have thus been proposed. While there are
efficient approximate inference methods for such CRFs, usually they are
sensitive to initialization and make strong assumptions. In this work, we
develop an efficient, yet general algorithm for inference on fully-connected
CRFs. The algorithm is based on a scalable SDP algorithm and the low-rank
approximation of the similarity/kernel matrix. The core of the proposed
algorithm is a tailored quasi-Newton method that takes advantage of the
low-rank matrix approximation when solving the specialized SDP dual problem.
Experiments demonstrate that our method can be applied on fully-connected CRFs
that could not be handled previously, such as pixel-level image co-segmentation.
Comment: 15 pages. A conference version of this work appears in Proc. IEEE
Conference on Computer Vision and Pattern Recognition, 201
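The role of the low-rank kernel approximation in this abstract can be
illustrated with a generic sketch (assumed setup and sizes, not the paper's
implementation): a dense similarity matrix K over all pixel pairs is too
large to store, but a rank-r truncated eigendecomposition K ~ U diag(s) U^T
lets matrix-vector products K @ v run in O(n r) instead of O(n^2), which is
the kind of saving a quasi-Newton solver for the SDP dual relies on.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))          # 200 "pixels", 2 features each
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq / 2.0)                      # dense Gaussian kernel matrix

r = 20
w, V = np.linalg.eigh(K)                   # eigenvalues in ascending order
U, s = V[:, -r:], w[-r:]                   # keep the top-r eigenpairs

v = rng.standard_normal(200)
exact = K @ v                              # O(n^2) matrix-vector product
approx = U @ (s * (U.T @ v))               # O(n r) low-rank product
rel_err = np.linalg.norm(exact - approx) / np.linalg.norm(exact)
print(rel_err)
```

Because the Gaussian kernel's spectrum decays quickly, a small rank r
already reproduces the product accurately; the worst-case error is bounded
by the largest discarded eigenvalue times the norm of v.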