Sparse sum-of-squares (SOS) optimization: A bridge between DSOS/SDSOS and SOS optimization for sparse polynomials
Optimization over non-negative polynomials is fundamental for nonlinear
systems analysis and control. We investigate the relation between three
tractable relaxations for optimizing over sparse non-negative polynomials:
sparse sum-of-squares (SSOS) optimization, diagonally dominant sum-of-squares
(DSOS) optimization, and scaled diagonally dominant sum-of-squares (SDSOS)
optimization. We prove that the set of SSOS polynomials, an inner approximation
of the cone of SOS polynomials, strictly contains the spaces of sparse
DSOS/SDSOS polynomials. When applicable, therefore, SSOS optimization is less
conservative than its DSOS/SDSOS counterparts. Numerical results for
large-scale sparse polynomial optimization problems demonstrate this fact, and
also that SSOS optimization can be faster than DSOS/SDSOS methods despite
requiring the solution of semidefinite programs instead of less expensive
linear/second-order cone programs.

Comment: 9 pages, 3 figures
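As a rough illustration of why DSOS optimization is cheaper but more conservative than SOS (this sketch is not from the paper): a polynomial is DSOS when it admits a Gram matrix that is diagonally dominant with a nonnegative diagonal, a condition checkable with linear programming instead of an SDP. A minimal numpy check of that Gram-matrix condition:

```python
import numpy as np

def is_diagonally_dominant(Q, tol=1e-12):
    """DSOS Gram-matrix condition: Q symmetric with
    Q[i, i] >= sum_{j != i} |Q[i, j]| for every row i.
    Such a Q is positive semidefinite, so the polynomial it
    represents is SOS; the converse fails, hence conservatism."""
    Q = np.asarray(Q, dtype=float)
    off_diag = np.sum(np.abs(Q), axis=1) - np.abs(np.diag(Q))
    return bool(np.all(np.diag(Q) >= off_diag - tol))

# Gram matrix of p(x, y) = x^2 + x*y + y^2 in the basis [x, y]:
Q = np.array([[1.0, 0.5],
              [0.5, 1.0]])
print(is_diagonally_dominant(Q))  # True: this certificate shows p is DSOS
```

Replacing diagonal dominance by scaled diagonal dominance gives the SDSOS cone (a second-order cone condition); the paper's point is that sparse SOS sits strictly between these cheap cones and full SOS.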
Smaller SDP for SOS Decomposition
A popular numerical method for computing SOS (sum of squares of polynomials)
decompositions is to transform the problem into semidefinite
programming (SDP) problems and then solve them with SDP solvers. In this paper,
we focus on reducing the sizes of inputs to SDP solvers to improve the
efficiency and reliability of those SDP based methods. Two types of
polynomials, convex cover polynomials and split polynomials, are defined. A
convex cover polynomial or a split polynomial can be decomposed into several
smaller sub-polynomials such that the original polynomial is SOS if and only if
the sub-polynomials are all SOS. Thus the original SOS problem can be
decomposed equivalently into smaller sub-problems. It is proved that convex
cover polynomials are split polynomials, and that sparse polynomials with many
variables are often split polynomials, a property which can be detected
efficiently in practice. Some necessary conditions for a polynomial to be SOS
are also given, which help to quickly refute polynomials that have no SOS
representation, so that SDP solvers need not be called at all. All the new
results lead to a new SDP based method to compute SOS decompositions, which
improves this kind of methods by passing smaller inputs to SDP solvers in some
cases. Experiments show that the number of monomials obtained by our program is
often smaller than that produced by other SDP-based software, especially for
polynomials with many variables and high degrees. Numerical results on various
tests are reported to show the performance of our program.

Comment: 18 pages
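To illustrate the splitting idea in the simplest case (a sketch under assumptions, not the paper's actual algorithm): when the monomial supports of a polynomial fall into groups with pairwise disjoint variable sets, each group can be handed to the SDP solver separately. Grouping supports that share variables, directly or transitively, is a connected-components computation:

```python
def split_support(monomials):
    """monomials: list of frozensets of variable names (the support
    of each monomial). Returns groups [varset, member_indices]:
    monomials sharing a variable, directly or transitively, land in
    the same group. Constant monomials form their own group."""
    groups = []
    for i, vs in enumerate(monomials):
        vs = set(vs)
        hits = [g for g in groups if g[0] & vs]   # groups touching vs
        for g in hits:
            vs |= g[0]                            # merge their variables
        members = [i] + [j for g in hits for j in g[1]]
        groups = [g for g in groups if g not in hits]
        groups.append([vs, members])
    return groups

# p = x^2 + x*y + y^2 + z^2 splits into {x, y}-part and {z}-part:
supports = [frozenset({"x"}), frozenset({"x", "y"}),
            frozenset({"y"}), frozenset({"z"})]
print(split_support(supports))  # two groups
```

The paper's split polynomials generalize this: the sub-polynomials need not use disjoint variables, only satisfy conditions ensuring the SOS property is preserved in both directions.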
The power of sum-of-squares for detecting hidden structures
We study planted problems---finding hidden structures in random noisy
inputs---through the lens of the sum-of-squares semidefinite programming
hierarchy (SoS). This family of powerful semidefinite programs has recently
yielded many new algorithms for planted problems, often achieving the best
known polynomial-time guarantees in terms of accuracy of recovered solutions
and robustness to noise. One theme in recent work is the design of spectral
algorithms which match the guarantees of SoS algorithms for planted problems.
Classical spectral algorithms are often unable to accomplish this: the twist in
these new spectral algorithms is the use of spectral structure of matrices
whose entries are low-degree polynomials of the input variables. We prove that
for a wide class of planted problems, including refuting random constraint
satisfaction problems, tensor and sparse PCA, densest-k-subgraph, community
detection in stochastic block models, planted clique, and others, eigenvalues
of degree-d matrix polynomials are as powerful as SoS semidefinite programs of
roughly degree d. For such problems it is therefore always possible to match
the guarantees of SoS without solving a large semidefinite program. Using
related ideas on SoS algorithms and low-degree matrix polynomials (and inspired
by recent work on SoS and the planted clique problem by Barak et al.), we prove
new nearly-tight SoS lower bounds for the tensor and sparse principal component
analysis problems. Our lower bounds for sparse principal component analysis are
the first to suggest that going beyond existing algorithms for this problem may
require sub-exponential time.
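A toy version of the spectral approach described above (my sketch, not the authors' construction): for planted clique, the input matrix itself is already a degree-1 matrix polynomial, and its top eigenvalue separates the planted and null cases once the plant is large enough.

```python
import numpy as np

rng = np.random.default_rng(0)

def top_eigenvalue(n, clique=0):
    """Top eigenvalue of a symmetric +/-1 noise matrix, optionally
    with a planted all-ones principal block of size `clique`. The
    matrix itself is the degree-1 case of the low-degree matrix
    polynomials discussed above."""
    A = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), 1)
    A = A + A.T  # symmetric, zero diagonal
    if clique:
        A[:clique, :clique] = 1.0                   # plant the block
        np.fill_diagonal(A[:clique, :clique], 0.0)  # keep diagonal zero
    return float(np.linalg.eigvalsh(A)[-1])

# Null top eigenvalue is about 2*sqrt(n); a planted block of size
# k >> sqrt(n) pushes it up to about k, so one eigenvalue suffices.
print(top_eigenvalue(400), top_eigenvalue(400, clique=80))
```

The content of the paper is that for plants too small for such degree-1 statistics, eigenvalues of degree-d polynomial matrices in the input recover essentially the full power of degree-d SoS.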
Stochastic collocation on unstructured multivariate meshes
Collocation has become a standard tool for approximation of parameterized
systems in the uncertainty quantification (UQ) community. Techniques for
least-squares regularization, compressive sampling recovery, and interpolatory
reconstruction are becoming standard tools used in a variety of applications.
Selection of a collocation mesh is frequently a challenge, but methods that
construct geometrically "unstructured" collocation meshes have shown great
potential due to attractive theoretical properties and direct, simple
generation and implementation. We investigate properties of these meshes,
presenting stability and accuracy results that can be used as guides for
generating stochastic collocation grids in multiple dimensions.

Comment: 29 pages, 6 figures
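A minimal sketch of least-squares collocation on an unstructured mesh, with a hypothetical two-parameter response (the model and basis here are my illustration, not the paper's): sample random points, evaluate the system, and regress onto a small polynomial basis.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(y):
    # hypothetical parameterized response, y in [-1, 1]^2
    return 1.0 + 2.0 * y[..., 0] + 3.0 * y[..., 0] * y[..., 1]

# unstructured collocation mesh: random points, no tensor grid
Y = rng.uniform(-1.0, 1.0, size=(200, 2))
u = model(Y)

# small polynomial basis: [1, y1, y2, y1*y2]
V = np.column_stack([np.ones(len(Y)), Y[:, 0], Y[:, 1], Y[:, 0] * Y[:, 1]])
coef, *_ = np.linalg.lstsq(V, u, rcond=None)
print(np.round(coef, 6))  # recovers the coefficients [1, 2, 0, 3]
```

The stability and accuracy results surveyed in the paper concern exactly this setting: how the distribution of the mesh points and the conditioning of the design matrix V control the quality of the recovered surrogate.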