A two-phase gradient method for quadratic programming problems with a single linear constraint and bounds on the variables
We propose a gradient-based method for quadratic programming problems with a
single linear constraint and bounds on the variables. Inspired by the GPCG
algorithm for bound-constrained convex quadratic programming [J.J. Mor\'e and
G. Toraldo, SIAM J. Optim. 1, 1991], our approach alternates between two phases
until convergence: an identification phase, which performs gradient projection
iterations until either a candidate active set is identified or no reasonable
progress is made, and an unconstrained minimization phase, which reduces the
objective function in a suitable space defined by the identification phase, by
applying either the conjugate gradient method or a recently proposed spectral
gradient method. However, the algorithm differs from GPCG not only because it
deals with a more general class of problems, but mainly in the way it stops the
minimization phase: the stopping rule compares a measure of optimality in the
reduced space with a measure of bindingness of the variables at their bounds,
obtained by extending the concept of proportioning, originally proposed for
box-constrained problems. If the
objective function is bounded, the algorithm converges to a stationary point
thanks to a suitable application of the gradient projection method in the
identification phase. For strictly convex problems, the algorithm converges to
the optimal solution in a finite number of steps even in case of degeneracy.
Extensive numerical experiments show the effectiveness of the proposed
approach. Comment: 30 pages, 17 figures
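The two-phase alternation described above can be sketched on the simpler box-constrained case that GPCG itself targets (dropping the paper's additional linear constraint). This is an illustrative sketch, not the authors' algorithm: the function name, the fixed five projected-gradient steps, and the direct solve standing in for conjugate gradient on the reduced space are all assumptions made here for brevity.

```python
import numpy as np

def two_phase_box_qp(A, b, lo, hi, x0, outer=50, tol=1e-8):
    """Sketch of a GPCG-style two-phase method for
    min 0.5*x@A@x + b@x  subject to  lo <= x <= hi  (A symmetric PD).
    Phase 1 identifies a candidate active set via projected gradient;
    phase 2 minimizes over the free variables (direct solve stands in
    for the conjugate gradient method used in practice)."""
    x = np.clip(x0, lo, hi)
    for _ in range(outer):
        # identification phase: projected gradient with exact QP step length
        for _ in range(5):
            g = A @ x + b
            alpha = (g @ g) / max(g @ A @ g, 1e-16)
            x = np.clip(x - alpha * g, lo, hi)
        g = A @ x + b
        # a bound is binding if the gradient pushes further into it
        active = ((x <= lo) & (g > 0)) | ((x >= hi) & (g < 0))
        free = np.flatnonzero(~active)
        if free.size == 0 or np.linalg.norm(g[free]) < tol:
            return x
        # minimization phase on the reduced space of free variables
        fixed = np.flatnonzero(active)
        rhs = -(b[free] + A[np.ix_(free, fixed)] @ x[fixed])
        d = np.zeros_like(x)
        d[free] = np.linalg.solve(A[np.ix_(free, free)], rhs) - x[free]
        t = 1.0  # backtrack so the trial point stays in the box
        while not np.allclose(np.clip(x + t * d, lo, hi), x + t * d) and t > 1e-12:
            t *= 0.5
        x = np.clip(x + t * d, lo, hi)
    return x
```

On a strictly convex instance whose unconstrained minimizer violates the bounds, the sketch returns the clipped-and-corrected stationary point in a few outer iterations, mirroring the finite-termination behavior the abstract claims for the full method.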
An Algorithm for Global Maximization of Secrecy Rates in Gaussian MIMO Wiretap Channels
Optimal signaling for secrecy rate maximization in Gaussian MIMO wiretap
channels is considered. While this channel has attracted significant
attention recently and a number of results have been obtained, including the
proof of the optimality of Gaussian signalling, an optimal transmit covariance
matrix is known for some special cases only and the general case remains an
open problem. An iterative custom-made algorithm to find a globally-optimal
transmit covariance matrix in the general case is developed in this paper, with
guaranteed convergence to a \textit{global} optimum. While the original
optimization problem is not convex and hence difficult to solve, its minimax
reformulation can be solved via convex optimization tools, which is
exploited here. The proposed algorithm is based on the barrier method extended
to deal with the minimax problem at hand. Its convergence to a global optimum is
proved for the general case (degraded or not) and a bound for the optimality
gap is given for each step of the barrier method. The performance of the
algorithm is demonstrated via numerical examples. In particular, 20 to 40
Newton steps suffice to solve the sufficient optimality conditions with very
high precision (up to machine precision), even for large systems. Even fewer
steps are required if the secrecy capacity is the
only quantity of interest. The algorithm can be significantly simplified for
the degraded channel case and can also be adapted to include per-antenna
power constraints (instead of, or in addition to, the total power constraint). It
also solves the dual problem of minimizing the total power subject to the
secrecy rate constraint. Comment: accepted by IEEE Transactions on Communications
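The objective being maximized can be evaluated directly for any candidate transmit covariance. A minimal sketch under the standard wiretap model, with Gaussian signalling and legitimate-receiver and eavesdropper channel matrices Hr and He (the function and variable names are illustrative, not from the paper):

```python
import numpy as np

def secrecy_rate(Hr, He, Q):
    """Secrecy rate (nats) of a Gaussian MIMO wiretap channel for a given
    transmit covariance Q:
        C(Q) = ln det(I + Hr Q Hr^H) - ln det(I + He Q He^H),
    where Hr/He are the receiver and eavesdropper channel matrices."""
    def rate(H):
        n = H.shape[0]
        # slogdet is numerically safer than log(det(...)) for large systems
        _, logdet = np.linalg.slogdet(np.eye(n) + H @ Q @ H.conj().T)
        return logdet
    return rate(Hr) - rate(He)
```

Maximizing this function over PSD matrices Q with bounded trace is the nonconvex problem the abstract refers to; the paper's contribution is the barrier-based algorithm that reaches its global optimum via the minimax reformulation, which this evaluation sketch does not attempt.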
On the Burer-Monteiro method for general semidefinite programs
Consider a semidefinite program (SDP) involving an $n \times n$ positive
semidefinite matrix $X$. The Burer-Monteiro method uses the substitution
$X = YY^T$ to obtain a nonconvex optimization problem in terms of an
$n \times p$ matrix $Y$. Boumal et al. showed that this nonconvex method
provably solves equality-constrained SDPs with a generic cost matrix when
$p \gtrsim \sqrt{2m}$, where $m$ is the number of constraints. In this note we extend
their result to arbitrary SDPs, possibly involving inequalities or multiple
semidefinite constraints. We derive similar guarantees for a fixed cost matrix
and generic constraints. We illustrate applications to matrix sensing and
integer quadratic minimization. Comment: 10 pages
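The factor-space idea can be illustrated on the max-cut SDP, $\min \langle C, X\rangle$ subject to $X \succeq 0$ and $\mathrm{diag}(X) = 1$, which has $m = n$ equality constraints, so $p \gtrsim \sqrt{2n}$ generically suffices. The projected-gradient sketch below is an illustration of the substitution $X = YY^T$, not the note's algorithm; names and step sizes are assumptions.

```python
import numpy as np

def burer_monteiro_maxcut(C, p, iters=500, lr=0.1, seed=0):
    """Burer-Monteiro sketch for the max-cut SDP
        min <C, X>  s.t.  X >= 0 (PSD),  diag(X) = 1,
    via X = Y Y^T with Y an n-by-p matrix. The unit-diagonal constraint
    becomes 'each row of Y lies on the unit sphere', enforced by
    renormalizing rows after every gradient step."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    Y = rng.standard_normal((n, p))
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)
    for _ in range(iters):
        G = 2.0 * C @ Y  # gradient of <C, Y Y^T> w.r.t. Y (C symmetric)
        Y = Y - lr * G
        Y /= np.linalg.norm(Y, axis=1, keepdims=True)  # re-project rows
    return Y @ Y.T  # feasible PSD matrix with unit diagonal
```

For a two-node graph with unit edge weight the sketch recovers the optimal $X = \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}$; in general, the guarantees discussed in the note concern when such low-rank factorizations have no spurious local optima, not the behavior of this particular descent scheme.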