A Primal-Dual Augmented Lagrangian
Nonlinearly constrained optimization problems can be solved by minimizing a sequence of simpler unconstrained or linearly constrained subproblems. In this paper, we discuss the formulation of subproblems in which the objective is a primal-dual generalization of the Hestenes-Powell augmented Lagrangian function. This generalization has the crucial feature that it is minimized with respect to both the primal and the dual variables simultaneously. A benefit of this approach is that the quality of the dual variables is monitored explicitly during the solution of the subproblem. Moreover, each subproblem may be regularized by imposing explicit bounds on the dual variables. Two primal-dual variants of conventional primal methods are proposed: a primal-dual bound-constrained Lagrangian (pdBCL) method and a primal-dual ℓ1 linearly constrained Lagrangian (pdℓ1-LCL) method.
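The joint minimization over primal and dual variables can be sketched on a toy equality-constrained problem. This is a minimal illustration, not the pdBCL/pdℓ1-LCL methods themselves: the problem, the penalty parameter, and the use of scipy's general-purpose minimizer are all assumptions, and the function below is a Hestenes-Powell-style primal-dual generalization with unit scaling of the dual regularization term.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem (hypothetical): min x1^2 + x2^2  s.t.  c(x) = x1 + x2 - 1 = 0
# Known solution: x* = (0.5, 0.5) with multiplier y* = 1.
f = lambda x: x[0]**2 + x[1]**2
c = lambda x: x[0] + x[1] - 1.0

def M(z, y_e, mu):
    # Primal-dual generalization of the Hestenes-Powell augmented
    # Lagrangian, minimized jointly over the primal x and the dual y.
    x, y = z[:2], z[2]
    return (f(x) - c(x) * y_e
            + c(x)**2 / (2.0 * mu)
            + (c(x) + mu * (y - y_e))**2 / (2.0 * mu))

y_e, mu = 0.0, 0.1            # dual estimate and penalty parameter (assumed)
z = np.zeros(3)
for _ in range(20):           # outer loop: refresh the dual estimate
    z = minimize(M, z, args=(y_e, mu)).x
    y_e = z[2]                # the subproblem itself produces the new dual

x_opt, y_opt = z[:2], z[2]
```

Because each subproblem is minimized with respect to (x, y) simultaneously, the dual estimate is read off directly from the subproblem solution rather than updated by a separate multiplier formula.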
Global convergence of a stabilized sequential quadratic semidefinite programming method for nonlinear semidefinite programs without constraint qualifications
In this paper, we propose a new sequential quadratic semidefinite programming
(SQSDP) method for solving nonlinear semidefinite programs (NSDPs), in which we
produce iteration points by solving a sequence of stabilized quadratic
semidefinite programming (QSDP) subproblems, which we derive from the minimax
problem associated with the NSDP. Unlike existing SQSDP methods, the proposed
one allows the QSDP subproblems to be solved only approximately while still
ensuring global convergence. Another notable feature of the proposed method is
that no constraint qualifications (CQs) are required in the global convergence
analysis. Specifically, under certain assumptions that do not involve CQs,
we prove the global convergence to a point satisfying any of the following: the
stationary conditions for the feasibility problem; the
approximate-Karush-Kuhn-Tucker (AKKT) conditions; the trace-AKKT conditions.
The latter two conditions are the new optimality conditions for the NSDP
presented by Andreani et al. (2018) in place of the Karush-Kuhn-Tucker
conditions. Finally, we conduct some numerical experiments to examine the
efficiency of the proposed method.
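The AKKT and trace-AKKT residuals mentioned above can be made concrete on a tiny NSDP. The problem, the candidate point, and the multiplier below are a hypothetical illustration, not the paper's method; the sketch only shows what evaluating these residuals at a computed iterate looks like.

```python
import numpy as np

# Toy NSDP (hypothetical):  min x1 + x2  s.t.  G(x) = [[x1, 1], [1, x2]] PSD.
# Its solution is x* = (1, 1) with multiplier Lambda* = [[1, -1], [-1, 1]].

def akkt_residuals(x, Lam):
    """Residuals of the approximate KKT conditions for the toy NSDP:
    stationarity of the Lagrangian, primal PSD feasibility, dual
    feasibility, and the trace complementarity used by trace-AKKT."""
    G = np.array([[x[0], 1.0], [1.0, x[1]]])
    # grad f minus the adjoint of DG applied to Lam:
    # d<Lam, G(x)>/dx = (Lam11, Lam22), and grad f = (1, 1).
    stationarity = np.array([1.0 - Lam[0, 0], 1.0 - Lam[1, 1]])
    feas = max(0.0, -np.linalg.eigvalsh(G)[0])        # PSD violation of G(x)
    dual_feas = max(0.0, -np.linalg.eigvalsh(Lam)[0])  # Lambda must be PSD
    comp = abs(np.trace(Lam @ G))                      # trace-AKKT residual
    return (float(np.linalg.norm(stationarity)), feas, dual_feas, comp)

x_star = np.array([1.0, 1.0])
Lam_star = np.array([[1.0, -1.0], [-1.0, 1.0]])
print(akkt_residuals(x_star, Lam_star))  # all four residuals are ~0
```

A sequence of iterates satisfies the trace-AKKT conditions when these residuals tend to zero along the sequence, with no constraint qualification needed.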
A globally convergent SQP-type method with least constraint violation for nonlinear semidefinite programming
We present a globally convergent SQP-type method with the least constraint
violation for nonlinear semidefinite programming. The proposed algorithm
employs a two-phase strategy coupled with a line search technique. In the first
phase, a subproblem based on a local model of infeasibility is formulated to
determine a corrective step. In the second phase, a search direction that moves
toward optimality is computed by minimizing a local model of the objective
function. Importantly, regardless of the feasibility of the original problem,
the iterative sequence generated by our proposed method converges to a
Fritz-John point of a transformed problem, wherein the constraint violation is
minimized. Numerical experiments have been conducted on various complex
scenarios to demonstrate the effectiveness of our approach.
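On a generic NLP (not an SDP), the two-phase idea can be sketched in a few lines. Everything below is an assumption for illustration: a toy problem, an identity Hessian model, a least-squares corrective step, and no line search; it is a caricature of the strategy, not the paper's algorithm.

```python
import numpy as np

# Toy problem (hypothetical):
#   min  f(x) = x1^2 + x2^2   s.t.   c(x) = x1^2 + x2 - 1 = 0
# Local solution: x* = (sqrt(0.5), 0.5) with f* = 0.75.
f = lambda x: x[0]**2 + x[1]**2
g = lambda x: 2 * x                       # gradient of f
c = lambda x: x[0]**2 + x[1] - 1.0
J = lambda x: np.array([2 * x[0], 1.0])   # Jacobian (row) of c

x = np.array([1.0, 1.0])
for _ in range(100):
    Jx, cx = J(x), c(x)
    # Phase 1: corrective (normal) step -- minimize the local model of
    # infeasibility ||c + J n||, via the minimum-norm least-squares solution.
    n = -Jx * cx / (Jx @ Jx)
    # Phase 2: optimality (tangential) step -- minimize a local quadratic
    # model of f (Hessian model B = I) over the null space of J, so the
    # linearized feasibility gained in phase 1 is preserved.
    P = np.eye(2) - np.outer(Jx, Jx) / (Jx @ Jx)   # projector onto null(J)
    t = -P @ (g(x) + n)
    x = x + n + t

print(x, f(x), c(x))   # approaches x* = (sqrt(0.5), 0.5), f* = 0.75
```

The split matters because phase 1 is well defined even when the problem is infeasible: the normal step still reduces the local infeasibility model, which is what drives convergence to a point of least constraint violation in that case.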
Local convergence of a sequential quadratic programming method for a class of nonsmooth nonconvex objectives
A sequential quadratic programming (SQP) algorithm is designed for nonsmooth
optimization problems with upper-C^2 objective functions. Upper-C^2 functions
are locally equivalent to difference-of-convex (DC) functions with smooth
convex parts. They arise naturally in many applications such as certain classes
of solutions to parametric optimization problems, e.g., recourse of stochastic
programming, and projection onto closed sets. The proposed algorithm conducts
line search and adopts an exact penalty merit function. The potential
inconsistency due to the linearization of constraints is addressed through a
relaxation similar to that of Sl_1QP. We show that the algorithm is globally
convergent under reasonable assumptions. Moreover, we study the local
convergence behavior of the algorithm under additional assumptions of
Kurdyka-Łojasiewicz (KL) properties, which have been applied to many
nonsmooth optimization problems. Due to the nonconvex nature of the problems, a
special potential function is used to analyze local convergence. We show that,
under suitable assumptions, upper bounds on the local convergence rate can be proven.
Additionally, we show that for a large number of optimization problems with
upper-C^2 objectives, their corresponding potential functions are indeed KL
functions. Numerical experiments are performed on a power grid optimization
problem that is consistent with the assumptions and analysis in this paper.
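The Sl_1QP-style relaxation mentioned above can be sketched with a linear model of the objective, so the subproblem becomes an LP solvable by scipy. The problem data, penalty weight, and trust-region radius below are assumptions; the point of the sketch is that elastic variables keep the subproblem feasible even when the linearized constraints are inconsistent.

```python
import numpy as np
from scipy.optimize import linprog

# Relaxed step computation in the spirit of Sl_1QP (linear model, B = 0):
# the linearized constraints c + J d = 0 are replaced by elastics s+, s- >= 0,
#   min  g^T d + rho * sum(s+ + s-)
#   s.t. J d - s+ + s- = -c,   |d_i| <= Delta,
# which is always feasible regardless of the linearizations.

def sl1_step(g, J, cvals, rho=10.0, Delta=0.5):
    m, n = J.shape
    # variable layout: [d (n), s+ (m), s- (m)]
    obj = np.concatenate([g, rho * np.ones(2 * m)])
    A_eq = np.hstack([J, -np.eye(m), np.eye(m)])
    bounds = [(-Delta, Delta)] * n + [(0, None)] * (2 * m)
    res = linprog(obj, A_eq=A_eq, b_eq=-cvals, bounds=bounds)
    return res.x[:n]

# Inconsistent linearizations: d1 = 0 and d1 = 1 cannot both hold, yet the
# elastic subproblem still returns a well-defined step (d1 = 0, since the
# penalty weight rho dominates the objective gradient).
g = np.array([1.0, 1.0])
J = np.array([[1.0, 0.0], [1.0, 0.0]])
cvals = np.array([0.0, -1.0])
d = sl1_step(g, J, cvals)
print(d)
```

The same elastics that restore subproblem feasibility correspond to the ℓ1 terms of the exact penalty merit function, which is what ties the step computation to the line search globalization.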
On the Burer-Monteiro method for general semidefinite programs
Consider a semidefinite program (SDP) involving an n x n positive
semidefinite matrix X. The Burer-Monteiro method uses the substitution
X = YY^T to obtain a nonconvex optimization problem in terms of an n x p
matrix Y. Boumal et al. showed that this nonconvex method provably solves
equality-constrained SDPs with a generic cost matrix when p ≳ sqrt(2m),
where m is the number of constraints. In this note we extend
their result to arbitrary SDPs, possibly involving inequalities or multiple
semidefinite constraints. We derive similar guarantees for a fixed cost matrix
and generic constraints. We illustrate applications to matrix sensing and
integer quadratic minimization.
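A minimal Burer-Monteiro sketch on an assumed max-cut-style SDP (min <C, X> subject to diag(X) = 1 and X PSD; the instance, rank p, step size, and solver are all illustrative choices, not the note's analysis): the substitution X = YY^T turns the m = n diagonal constraints into unit-norm rows of Y, which projected gradient descent can handle directly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
p = 4                       # p >= sqrt(2m) with m = n constraints, per Boumal et al.
A = rng.standard_normal((n, n))
C = (A + A.T) / 2           # generic symmetric cost matrix

def normalize_rows(Y):
    return Y / np.linalg.norm(Y, axis=1, keepdims=True)

# diag(Y Y^T) = 1 is exactly "each row of Y has unit norm", so feasibility
# is maintained by re-normalizing rows after each gradient step.
Y = normalize_rows(rng.standard_normal((n, p)))
for _ in range(10000):
    G = 2 * C @ Y                       # gradient of <C, Y Y^T> w.r.t. Y
    Y = normalize_rows(Y - 0.05 * G)    # step, then project back to the constraint

X = Y @ Y.T
obj = np.trace(C @ X)
# First-order stationarity on the product of spheres: project each row of
# the gradient onto the tangent space of its unit sphere.
G = 2 * C @ Y
riem = G - np.sum(G * Y, axis=1, keepdims=True) * Y
print(obj, np.linalg.norm(riem))
```

The factor Y has n*p entries instead of the n*(n+1)/2 of X, which is the computational appeal of the method; the cited results address when the resulting nonconvex landscape has no spurious local minima.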