An Accelerated DC Programming Approach with Exact Line Search for the Symmetric Eigenvalue Complementarity Problem
In this paper, we are interested in developing an accelerated
Difference-of-Convex (DC) programming algorithm based on the exact line search
for efficiently solving the Symmetric Eigenvalue Complementarity Problem
(SEiCP) and Symmetric Quadratic Eigenvalue Complementarity Problem (SQEiCP). We
first prove that any SEiCP is equivalent to an SEiCP with symmetric positive
definite matrices only. Then, we establish DC programming formulations for
two equivalent formulations of the SEiCP (namely, the logarithmic formulation
and the quadratic formulation), and propose an accelerated DC algorithm (BDCA)
that combines the classical DCA with an inexpensive exact line search,
performed by finding the real roots of a binomial. We demonstrate the
equivalence between the SQEiCP and the SEiCP, and extend BDCA to the SQEiCP.
Numerical simulations of the proposed BDCA and DCA against KNITRO, FILTERSD,
and MATLAB FMINCON for the SEiCP and SQEiCP, on both synthetic datasets and
the Matrix Market NEP Repository, are reported. BDCA dramatically accelerates
the convergence of DCA toward better numerical solutions, and outperforms the
KNITRO, FILTERSD, and FMINCON solvers in terms of average CPU time and average
solution precision, especially for large-scale cases.
Comment: 24 pages
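As a rough illustration of the DCA-plus-exact-line-search idea above, here is a minimal Python sketch on a toy DC program f(x) = g(x) - h(x) with convex quadratic components. The instance is hypothetical, not the SEiCP formulations of the paper; for quadratics the line-search derivative is linear rather than a binomial, but the closed-form root-finding plays the same role.

    # Minimal BDCA-style sketch on a toy DC program
    #   f(x) = 0.5 x'Ax - 0.5 x'Bx,  A, B symmetric positive definite
    # (hypothetical toy instance, not the paper's SEiCP formulations).
    import numpy as np

    def bdca(A, B, x, iters=100, tol=1e-10):
        for _ in range(iters):
            # Classical DCA step: solve grad g(y) = grad h(x), i.e. A y = B x.
            y = np.linalg.solve(A, B @ x)
            d = y - x                       # DCA displacement direction
            # Exact line search along d starting from y: phi(t) = f(y + t d)
            # is quadratic here, so phi'(t) = 0 is solved in closed form
            # (in the paper, phi' is a binomial with closed-form real roots).
            M = A - B
            denom = d @ (M @ d)
            t = max(0.0, -(d @ (M @ y)) / denom) if denom > 1e-14 else 0.0
            x_new = y + t * d
            if np.linalg.norm(x_new - x) < tol:
                return x_new
            x = x_new
        return x

    rng = np.random.default_rng(0)
    Q = rng.standard_normal((5, 5))
    A = Q @ Q.T + 5 * np.eye(5)             # SPD
    B = 2.0 * np.eye(5)                     # SPD
    x_star = bdca(A, B, rng.standard_normal(5))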
Sequential Convex Programming Methods for Solving Nonlinear Optimization Problems with DC Constraints
This paper investigates the relation between sequential convex programming
(SCP), as defined, e.g., in [24], and DC (difference of two convex functions)
programming. We first present an SCP algorithm for solving nonlinear
optimization problems with DC constraints and prove its convergence. Then we
combine the proposed algorithm with a relaxation technique to handle
inconsistent linearizations. Numerical tests are performed to investigate the
behaviour of this class of algorithms.
Comment: 18 pages, 1 figure
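To make the SCP treatment of a DC constraint concrete, the following minimal sketch keeps the convex part of a toy constraint g(x) - h(x) <= 0, linearizes the concave part around the current iterate, and solves the resulting convex subproblem with cvxpy. The instance (projecting a point onto the exterior of the unit ball) is hypothetical and not taken from the paper.

    # SCP / convex-concave sketch for a DC constraint g(x) - h(x) <= 0.
    # Toy instance: minimize ||x - x0||^2 subject to 1 - ||x||^2 <= 0,
    # i.e. g(x) = 1 (convex), h(x) = ||x||^2 (convex, linearized at xk).
    import numpy as np
    import cvxpy as cp

    x0 = np.array([0.3, 0.2])               # point to project
    xk = np.array([1.0, 0.0])               # feasible starting point
    for _ in range(20):
        x = cp.Variable(2)
        # Linearized constraint: 1 - ||xk||^2 - 2 xk'(x - xk) <= 0
        lin = 1 - xk @ xk - 2 * xk @ (x - xk)
        prob = cp.Problem(cp.Minimize(cp.sum_squares(x - x0)), [lin <= 0])
        prob.solve()
        if np.linalg.norm(x.value - xk) < 1e-8:
            break
        xk = x.value
    print(xk)   # for this instance the iterates approach x0 / ||x0||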
On Difference-of-SOS and Difference-of-Convex-SOS Decompositions for Polynomials
In this paper, we are interested in developing polynomial decomposition
techniques to reformulate real-valued multivariate polynomials into
difference-of-sums-of-squares (namely, D-SOS) and
difference-of-convex-sums-of-squares (namely, DC-SOS). Firstly, we prove that
the sets of D-SOS and DC-SOS polynomials are vector spaces that coincide with
the set of real-valued polynomials. Moreover, the problems of finding D-SOS
and DC-SOS decompositions are equivalent to semidefinite programs (SDPs),
which can be solved to any desired precision in polynomial time. Some important algebraic
properties and the relationships among the set of sums-of-squares (SOS)
polynomials, positive semidefinite (PSD) polynomials, convex-sums-of-squares
(CSOS) polynomials, SOS-convex polynomials, D-SOS and DC-SOS polynomials are
discussed. Secondly, we focus on establishing several practical algorithms for
constructing D-SOS and DC-SOS decompositions for any polynomial without solving
SDP. Using DC-SOS decomposition, we can reformulate polynomial optimization
problems in the realm of difference-of-convex (DC) programming, which can be
handled by efficient DC programming approaches. Some examples illustrate how to
use our methods for constructing D-SOS and DC-SOS decompositions. The
numerical performance of the D-SOS and DC-SOS decomposition algorithms and
their parallelized versions is tested on a synthetic dataset of 1750 randomly
generated sparse and dense polynomials of small and large size. Some real-world applications in
higher order moment portfolio optimization problems, eigenvalue complementarity
problems, Euclidean distance matrix completion problems, and Boolean polynomial
programs are also presented.
Comment: 47 pages, 19 figures
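For a flavour of what a difference-of-convex decomposition looks like in the simplest case, the sketch below splits a quadratic polynomial p(x) = x'Qx into a difference of two convex quadratics via the spectral decomposition of Q. This toy construction is only suggestive: the paper's D-SOS and DC-SOS algorithms handle general polynomials without solving any SDP.

    # Toy spectral DC split for a quadratic polynomial p(x) = x'Qx:
    # Q = Q_plus - Q_minus with both parts PSD, so p is the difference
    # of two convex quadratics (each a sum of squares of linear forms).
    import numpy as np

    def dc_split(Q):
        w, V = np.linalg.eigh((Q + Q.T) / 2)     # symmetrize, eigendecompose
        Q_plus = V @ np.diag(np.maximum(w, 0)) @ V.T
        Q_minus = V @ np.diag(np.maximum(-w, 0)) @ V.T
        return Q_plus, Q_minus                   # Q == Q_plus - Q_minus

    Q = np.array([[1.0, 3.0], [3.0, -2.0]])
    Qp, Qm = dc_split(Q)
    assert np.allclose(Q, Qp - Qm)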
Revisit of Spectral Bundle Methods: Primal-dual (Sub)linear Convergence Rates
The spectral bundle method proposed by Helmberg and Rendl is well established
for solving large-scale semidefinite programs (SDPs) thanks to its low
per-iteration computational complexity and strong practical performance. In
this paper, we revisit this classic method, showing that it achieves sublinear
convergence rates for both the primal and dual SDPs under merely strong
duality, complementing previous guarantees on primal-dual convergence.
Moreover, we show that the method speeds up to linear convergence if (1)
structurally, the SDP admits strict complementarity, and (2) algorithmically,
the bundle method captures the rank of the optimal solutions. Such
complementarity and low-rank structure is prevalent in many modern and
classical applications. The linear convergence result is established via an
eigenvalue approximation lemma which may be of independent interest.
Numerically, we confirm our theoretical findings that the spectral bundle
method, for modern and classical applications, indeed speeds up under the
aforementioned conditions.
Comment: 30 pages, 2 figures
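For orientation, the spectral bundle method operates on the eigenvalue form of the dual SDP, min_y b'y + lambda_max(C - sum_i y_i A_i). The sketch below evaluates that objective and a subgradient (via the leading eigenvector) and runs plain subgradient descent on hypothetical data; the actual bundle method instead minimizes a cutting-plane model built from accumulated eigenvectors plus a proximal term.

    # Bare-bones subgradient sketch of the eigenvalue form of the dual SDP
    #   min_y  b'y + lambda_max(C - sum_i y_i A_i)
    # (hypothetical data; not the bundle model itself).
    import numpy as np

    def eigval_obj_and_subgrad(y, C, As, b):
        X = C - sum(yi * Ai for yi, Ai in zip(y, As))
        w, V = np.linalg.eigh(X)
        v = V[:, -1]                    # eigenvector of lambda_max
        f = b @ y + w[-1]
        g = b - np.array([v @ (Ai @ v) for Ai in As])   # subgradient in y
        return f, g

    rng = np.random.default_rng(1)
    n, m = 8, 3
    C = rng.standard_normal((n, n)); C = (C + C.T) / 2
    As = [np.eye(n)] + [np.diag(rng.standard_normal(n)) for _ in range(m - 1)]
    b = rng.standard_normal(m)
    y = np.zeros(m)
    for k in range(200):                # plain diminishing-step subgradient
        f, g = eigval_obj_and_subgrad(y, C, As, b)
        y -= 0.1 / (k + 1) * g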
A Boosted-DCA with Power-Sum-DC Decomposition for Linearly Constrained Polynomial Programs
This paper proposes a novel Difference-of-Convex (DC) decomposition for
polynomials using a power-sum representation, achieved by solving a sparse
linear system. We introduce the Boosted DCA with Exact Line Search (BDCAe) for
addressing linearly constrained polynomial programs within the DC framework.
Notably, we demonstrate that the exact line search reduces to finding the
roots of a univariate polynomial over an interval, with coefficients computed
explicitly from the power-sum DC decomposition. The subsequential
convergence of BDCAe to critical points is proven, and its convergence rate
under the Kurdyka-Lojasiewicz property is established. To efficiently tackle
the convex subproblems, we integrate the Fast Dual Proximal Gradient (FDPG)
method by exploiting the separable block structure of the power-sum DC
decompositions. We validate our approach through numerical experiments on the
Mean-Variance-Skewness-Kurtosis (MVSK) portfolio optimization model and
box-constrained polynomial optimization problems. Comparative analysis of BDCAe
against DCA, BDCA with Armijo line search, UDCA, and UBDCA with projective DC
decomposition, alongside standard nonlinear optimization solvers FMINCON and
FILTERSD, substantiates the efficiency of our proposed approach.
Comment: 39 pages, 5 figures
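The exact-line-search ingredient is easy to illustrate: for a polynomial objective f, phi(t) = f(x + t d) is a univariate polynomial, so its minimizer over an interval lies among the real roots of phi' and the endpoints. The generic sketch below recovers phi by interpolation for a hypothetical quartic; the paper instead obtains the coefficients in closed form from the power-sum DC decomposition.

    # Exact line search for a polynomial objective via roots of phi'(t),
    # where phi(t) = f(x + t d) has degree deg(f) in t (toy quartic f).
    import numpy as np

    def exact_line_search(f, x, d, deg, t_max=1.0):
        # phi has degree <= deg, so deg+1 samples determine it exactly.
        ts = np.linspace(0.0, t_max, deg + 1)
        phi = np.polynomial.Polynomial.fit(ts, [f(x + t * d) for t in ts], deg)
        roots = phi.deriv().roots()
        cands = [0.0, t_max] + [r.real for r in roots
                                if abs(r.imag) < 1e-10 and 0.0 < r.real < t_max]
        return min(cands, key=lambda t: f(x + t * d))

    f = lambda x: np.sum(x**4) - np.sum(x**2)   # hypothetical quartic
    x = np.array([0.9, -0.8]); d = np.array([-1.0, 1.0])
    t_star = exact_line_search(f, x, d, deg=4)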
A Riemannian low-rank method for optimization over semidefinite matrices with block-diagonal constraints
We propose a new algorithm to solve optimization problems of the form
$\min_X f(X)$ for a smooth function $f$, under the constraints that $X$ is
positive semidefinite and the diagonal blocks of $X$ are small identity
matrices. Such
problems often arise as the result of relaxing a rank constraint (lifting). In
particular, many estimation tasks involving phases, rotations, orthonormal
bases or permutations fit in this framework, and so do certain relaxations of
combinatorial problems such as Max-Cut. The proposed algorithm exploits the
facts that (1) such formulations admit low-rank solutions, and (2) their
rank-restricted versions are smooth optimization problems on a Riemannian
manifold. Combining insights from both the Riemannian and the convex geometries
of the problem, we characterize when second-order critical points of the smooth
problem reveal KKT points of the semidefinite problem. We compare against
state-of-the-art, mature software and find that, on certain interesting
problem instances, what we call the staircase method is orders of magnitude
faster, more accurate, and scales better. Code is available.
Comment: 37 pages, 3 figures
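A minimal sketch of the low-rank idea for the simplest block size: for the Max-Cut relaxation min <C, X> with X positive semidefinite and diag(X) = 1, factor X = YY' with unit-norm rows of Y and run Riemannian gradient descent on the resulting oblique manifold. This illustrates only the rank-restricted smooth problem, not the full staircase method with its rank-increase and second-order checks.

    # Low-rank (Burer-Monteiro) sketch for the Max-Cut SDP relaxation
    #   min <C, X>  s.t.  X PSD, diag(X) = 1  (1x1 identity blocks),
    # via X = Y Y' with rows of Y on the unit sphere.
    import numpy as np

    def maxcut_low_rank(C, p, iters=500, step=0.01, seed=0):
        n = C.shape[0]
        Y = np.random.default_rng(seed).standard_normal((n, p))
        Y /= np.linalg.norm(Y, axis=1, keepdims=True)      # unit-norm rows
        for _ in range(iters):
            G = 2 * C @ Y                                  # Euclidean gradient
            # Riemannian gradient: remove the radial component of each row
            G -= np.sum(G * Y, axis=1, keepdims=True) * Y
            Y -= step * G
            Y /= np.linalg.norm(Y, axis=1, keepdims=True)  # retraction
        return Y                                           # X = Y @ Y.T

    rng = np.random.default_rng(2)
    W = rng.random((10, 10)); C = (W + W.T) / 2
    Y = maxcut_low_rank(C, p=3)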