Robust Rotation Synchronization via Low-rank and Sparse Matrix Decomposition
This paper deals with the rotation synchronization problem, which arises in
global registration of 3D point-sets and in structure from motion. The problem
is formulated in an unprecedented way as a "low-rank and sparse" matrix
decomposition that handles both outliers and missing data. A minimization
strategy, dubbed R-GoDec, is also proposed and evaluated experimentally against
state-of-the-art algorithms on simulated and real data. The results show that
R-GoDec is the fastest among the robust algorithms.
Comment: The material contained in this paper is part of a manuscript submitted to CVI
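The generic "low-rank + sparse" idea can be sketched in a few lines of numpy: alternate a truncated-SVD projection (the low-rank part) with hard-thresholding of the residual (the sparse part), in the spirit of GoDec. The function name, the thresholding rule, and all parameters below are illustrative assumptions, not the paper's R-GoDec algorithm.

```python
import numpy as np

def low_rank_sparse(X, rank, sparsity, iters=50):
    """Alternating split X ~ L + S with L low-rank and S sparse
    (GoDec-style sketch; not the paper's R-GoDec)."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(iters):
        # best rank-r approximation of X - S via truncated SVD
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # keep only the largest-magnitude residual entries in S
        R = X - L
        thresh = np.sort(np.abs(R).ravel())[-sparsity]
        S = np.where(np.abs(R) >= thresh, R, 0.0)
    return L, S

# Rank-1 ground truth (all ones) corrupted by one gross outlier:
X = np.ones((10, 10))
X[2, 5] += 3.0
L, S = low_rank_sparse(X, rank=1, sparsity=1)
print(round(S[2, 5], 2))  # the outlier is absorbed by the sparse term S
```

In the rotation-synchronization setting, the low-rank term models the consistent pairwise rotations and the sparse term absorbs outlier measurements; missing data would additionally require restricting the fit to observed entries.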
Distributed Robustness Analysis of Interconnected Uncertain Systems Using Chordal Decomposition
Large-scale interconnected uncertain systems commonly have large state and
uncertainty dimensions. Aside from the heavy computational cost of centralized
robust stability analysis, privacy requirements in the network can also
introduce further issues. In this paper, we use integral quadratic constraint
(IQC) analysis to study large-scale interconnected uncertain systems, and we
avoid these issues through a decomposition scheme based on the interconnection
structure of the system. This scheme relies on the so-called chordal
decomposition and does not add any conservativeness to the analysis
approach. The decomposed problem can be solved using distributed computational
algorithms without the need for a centralized computational unit. We further
discuss the merits of the proposed analysis approach using a numerical
experiment.
Comment: 3 figures. Submitted to the 19th IFAC World Congress
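A toy illustration of the key structural fact (this is an assumption-laden sketch, not the paper's IQC machinery): for a chordal sparsity pattern, one large positive-semidefiniteness condition can be replaced by PSD conditions on the maximal cliques of the sparsity graph. A tridiagonal pattern is chordal with maximal cliques {i, i+1}, so each 2x2 principal block can be verified by a separate local agent, with no central computational unit.

```python
import numpy as np

def clique_psd_checks(A, tol=1e-9):
    """PSD test on each maximal clique {i, i+1} of a tridiagonal
    sparsity pattern; each check only touches local data."""
    n = A.shape[0]
    return [bool(np.all(np.linalg.eigvalsh(A[i:i + 2, i:i + 2]) >= -tol))
            for i in range(n - 1)]

# A diagonally dominant tridiagonal matrix: every clique block is PSD.
A = np.diag([2.0] * 4) + np.diag([-1.0] * 3, 1) + np.diag([-1.0] * 3, -1)
print(clique_psd_checks(A))  # [True, True, True]
```

Because each check involves only a small principal submatrix shared between neighboring agents, the overall analysis can be distributed along the interconnection structure.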
Sparse sum-of-squares (SOS) optimization: A bridge between DSOS/SDSOS and SOS optimization for sparse polynomials
Optimization over non-negative polynomials is fundamental for nonlinear
systems analysis and control. We investigate the relation between three
tractable relaxations for optimizing over sparse non-negative polynomials:
sparse sum-of-squares (SSOS) optimization, diagonally dominant sum-of-squares
(DSOS) optimization, and scaled diagonally dominant sum-of-squares (SDSOS)
optimization. We prove that the set of SSOS polynomials, an inner approximation
of the cone of SOS polynomials, strictly contains the cones of sparse
DSOS/SDSOS polynomials. Therefore, when applicable, SSOS optimization is less
conservative than its DSOS/SDSOS counterparts. Numerical results for
large-scale sparse polynomial optimization problems demonstrate this fact, and
also that SSOS optimization can be faster than DSOS/SDSOS methods despite
requiring the solution of semidefinite programs instead of less expensive
linear/second-order cone programs.
Comment: 9 pages, 3 figures
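The conservativeness gap between the DSOS and SOS conditions can be seen on a single Gram matrix (a minimal sketch, with illustrative function names): SOS requires the Gram matrix to be positive semidefinite (an SDP condition), while DSOS requires it to be diagonally dominant with nonnegative diagonal (plain linear inequalities). Some PSD matrices are not diagonally dominant, so the cheaper DSOS test can reject genuine SOS certificates.

```python
import numpy as np

def is_diag_dominant(Q):
    """DSOS-style test: nonnegative diagonal dominating the row sums
    of off-diagonal absolute values (checkable by linear inequalities)."""
    off = np.sum(np.abs(Q), axis=1) - np.abs(np.diag(Q))
    return bool(np.all(np.diag(Q) >= off))

def is_psd(Q, tol=1e-9):
    """SOS-style test: all eigenvalues nonnegative (an SDP condition)."""
    return bool(np.all(np.linalg.eigvalsh(Q) >= -tol))

Q = np.array([[1.0, 2.0],
              [2.0, 5.0]])  # trace 6, det 1: PSD, but row 0 fails dominance
print(is_psd(Q), is_diag_dominant(Q))  # True False
```

SSOS sits between these extremes: it keeps PSD conditions but only on small blocks dictated by the sparsity pattern, which is why it is less conservative than DSOS/SDSOS while remaining tractable.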
Improving Efficiency and Scalability of Sum of Squares Optimization: Recent Advances and Limitations
It is well-known that any sum of squares (SOS) program can be cast as a
semidefinite program (SDP) of a particular structure and that therein lies the
computational bottleneck for SOS programs, as the SDPs generated by this
procedure are large and costly to solve when the polynomials involved in the
SOS programs have many variables or high degree. In this paper, we
review SOS optimization techniques and present two new methods for improving
their computational efficiency. The first method leverages the sparsity of the
underlying SDP to obtain computational speed-ups. Further improvements can be
obtained if the coefficients of the polynomials that describe the problem have
a particular sparsity pattern, called chordal sparsity. The second method
bypasses semidefinite programming altogether and relies instead on solving a
sequence of more tractable convex programs, namely linear and second order cone
programs. This opens up the question as to how well one can approximate the
cone of SOS polynomials by second order representable cones. In the last part
of the paper, we present some recent negative results related to this question.
Comment: Tutorial for CDC 201
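The SOS-to-SDP correspondence mentioned above can be made concrete on a toy univariate example: p(x) is SOS if and only if p(x) = z^T Q z for some PSD matrix Q, where z is a vector of monomials. Searching for such a Q is the SDP; the sketch below only *verifies* a hand-picked certificate for p(x) = (x^2 + 2x + 1)^2 = (x + 1)^4, and all names are illustrative.

```python
import numpy as np

def gram_coeffs(Q):
    """Coefficients of z^T Q z in the basis 1, x, x^2, x^3, x^4
    for the monomial vector z = [1, x, x^2]."""
    return [float(Q[0, 0]),
            float(2 * Q[0, 1]),
            float(2 * Q[0, 2] + Q[1, 1]),
            float(2 * Q[1, 2]),
            float(Q[2, 2])]

v = np.array([1.0, 2.0, 1.0])  # x^2 + 2x + 1 in the basis [1, x, x^2]
Q = np.outer(v, v)             # rank-1 Gram matrix, hence PSD
print(gram_coeffs(Q))  # [1.0, 4.0, 6.0, 4.0, 1.0], i.e. (x + 1)^4
```

The size of Q grows with the number of monomials in z, which is exactly why the SDPs become expensive for polynomials with many variables or high degree, and why the sparsity-exploiting and LP/SOCP-based alternatives reviewed in the paper matter.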