
Accelerating convergence of a Separable Augmented Lagrangian Algorithm

Abstract

We analyze the numerical behaviour of a separable Augmented Lagrangian algorithm with multiple scaling parameters: a different parameter is associated with each dualized coupling constraint as well as with each subproblem. We show that an optimal superlinear rate of convergence can be attained in theory in the twice-differentiable case, and we propose an adaptive scaling strategy with the same ideal convergence properties. Numerical tests performed on quadratic programs confirm that Adaptive Global Scaling subsumes former scaling strategies with one or many parameters.
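For context, a minimal sketch of what "multiple scaling parameters" means in an augmented Lagrangian of this kind (standard notation, assumed here rather than taken from the paper): for a separable objective f and dualized coupling constraints a_j^T x = b_j, one scaling parameter r_j > 0 is attached to each constraint, giving

% sketch only; the symbols r_j, u_j, a_j, b_j are assumed notation, not the paper's
L_r(x, u) = f(x) + \sum_j u_j \, (a_j^{\top} x - b_j) + \tfrac{1}{2} \sum_j r_j \, (a_j^{\top} x - b_j)^2 ,
\qquad
u_j^{k+1} = u_j^{k} + r_j \, (a_j^{\top} x^{k+1} - b_j) .

A single-parameter method corresponds to the special case r_j = r for all j; the adaptive strategy studied in the paper instead tunes the individual r_j along the iterations.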
