
    Nonlinear Rescaling as Interior Quadratic Prox Method in Convex Optimization

    This paper is dedicated to Professor Elijah Polak on the occasion of his 75th birthday. A class Ψ of strictly concave, twice continuously differentiable functions ψ: R → R with particular properties is used for constraint transformation in the framework of a Nonlinear Rescaling (NR) method with “dynamic” scaling parameter updates. We show that the NR method is equivalent to the Interior Quadratic Prox method for the dual problem in a rescaled dual space. This equivalence is used to prove convergence and to estimate the rate of convergence of the NR method and its dual equivalent under very mild assumptions on the input data, for a wide class Ψ of constraint transformations. It is also used to estimate the rate of convergence under strict complementarity and under the standard second-order optimality condition. We prove that for any ψ ∈ Ψ that corresponds to a well-defined dual kernel ϕ = −ψ*, the NR method applied to LP generates a quadratically convergent dual sequence whenever the dual LP has a unique solution.
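    As a rough illustration of the NR idea (a minimal sketch, not the paper's exact algorithm), the sketch below rescales a single constraint c(x) ≥ 0 by a strictly concave transformation ψ(t) = log(1 + t), which satisfies ψ(0) = 0 and ψ'(0) = 1, then alternates a primal minimization of the rescaled Lagrangian with the multiplier update λ ← λ·ψ'(k·c(x)). The test problem, the fixed scaling parameter k, and the inner Newton solve are all assumptions chosen for concreteness:

    ```python
    import math

    # Toy problem: minimize f(x) = (x - 2)^2 subject to c(x) = 1 - x >= 0.
    # Solution: x* = 1 with optimal multiplier lam* = 2.
    # psi(t) = log(1 + t) is one admissible constraint transformation:
    # strictly concave, twice differentiable, psi(0) = 0, psi'(0) = 1.

    def dpsi(t):
        return 1.0 / (1.0 + t)  # psi'(t) for psi(t) = log(1 + t)

    def nr_iterate(k=10.0, lam=1.0, outer=20):
        """Sketch of an NR outer loop with a fixed scaling parameter k:
        the x-step minimizes L(x) = f(x) - (lam/k) * psi(k * c(x)),
        then the multiplier is updated by lam <- lam * psi'(k * c(x))."""
        x = 0.0
        for _ in range(outer):
            # Inner Newton solve of L'(x) = 0; L is strictly convex here.
            for _ in range(50):
                t = k * (1.0 - x)                 # argument of psi; needs t > -1
                g = 2.0 * (x - 2.0) + lam * dpsi(t)
                h = 2.0 + lam * k * dpsi(t) ** 2
                step = g / h
                # Damp the step so x stays inside the domain x < 1 + 1/k.
                while x - step >= 1.0 + 1.0 / k:
                    step *= 0.5
                x -= step
                if abs(g) < 1e-12:
                    break
            lam *= dpsi(k * (1.0 - x))            # dual (multiplier) update
        return x, lam

    x, lam = nr_iterate()
    print(x, lam)  # approaches x* = 1, lam* = 2
    ```

    The multiplier sequence generated this way is the quantity the paper analyzes through its dual prox interpretation: here it converges to λ* = 2 even though k stays fixed, which is the practical appeal of NR over classical penalty methods that must drive the penalty parameter to infinity.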