
Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization



Augmented Lagrangian methods for large-scale optimization usually require efficient algorithms for minimization with box constraints. On the other hand, active-set box-constraint methods employ unconstrained optimization algorithms for minimization inside the faces of the box. Several approaches may be employed to compute internal search directions in the large-scale case. In this paper a minimal-memory quasi-Newton approach with secant preconditioners is proposed, taking into account the structure of augmented Lagrangians that come from the popular Powell-Hestenes-Rockafellar scheme. A combined algorithm, which uses either the quasi-Newton formula or a truncated-Newton procedure depending on the presence of active constraints in the penalty-Lagrangian function, is also suggested. Numerical experiments using the CUTE collection are presented.
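To make the two ingredients of the abstract concrete, the following sketch shows (a) the Powell-Hestenes-Rockafellar augmented Lagrangian for inequality constraints g(x) ≤ 0 with its first-order multiplier update, and (b) a memoryless (single secant pair) BFGS search direction. This is an illustrative sketch only, not the paper's structured formula or code; the function names, signatures, and the use of NumPy are assumptions.

```python
import numpy as np

def phr_augmented_lagrangian(f, g, x, lam, rho):
    """PHR augmented Lagrangian for min f(x) s.t. g(x) <= 0:
    L(x, lam, rho) = f(x) + (1/(2*rho)) * sum( max(0, lam_i + rho*g_i(x))^2 - lam_i^2 ).
    Illustrative sketch; names and signatures are assumptions."""
    gx = np.asarray(g(x))
    shifted = np.maximum(0.0, lam + rho * gx)
    return f(x) + (shifted @ shifted - lam @ lam) / (2.0 * rho)

def phr_multiplier_update(g, x, lam, rho):
    """First-order multiplier update used in PHR-type schemes:
    lam_new = max(0, lam + rho * g(x))."""
    return np.maximum(0.0, lam + rho * np.asarray(g(x)))

def memoryless_bfgs_direction(grad, s, y):
    """Search direction d = -H*grad, where H is the BFGS update of the
    identity built from a single secant pair (s, y); it satisfies the
    secant equation H*y = s. A minimal-memory sketch, not the paper's
    structured or preconditioned variant."""
    rho = 1.0 / (y @ s)          # curvature must satisfy y @ s > 0
    sg, yg = s @ grad, y @ grad
    Hg = (grad - rho * sg * y - rho * yg * s
          + (rho ** 2) * (y @ y) * sg * s + rho * sg * s)
    return -Hg
```

For example, with f(x) = x^2 and the constraint x >= 1 written as g(x) = 1 - x <= 0, evaluating the PHR function at an infeasible point yields a penalized value, and the multiplier update moves lam toward the shifted multiplier max(0, lam + rho*g(x)).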

Topics: nonlinear programming, augmented Lagrangian methods, box constraints, quasi-Newton, truncated-Newton, bound-constrained optimization, linear-dependence condition, projected gradient methods, unconstrained minimization, guaranteed descent, convex sets, algorithm, Barzilai, qualification, convergence, Operations Research & Management Science, Mathematics, Applied
Publisher: Springer
Year: 2013