Inexact reduced gradient methods in smooth nonconvex optimization

Abstract

This paper proposes and develops new line search methods with inexact gradient information for finding stationary points of nonconvex continuously differentiable functions on finite-dimensional spaces. Some abstract convergence results for a broad class of line search methods are reviewed and extended. A general scheme for inexact reduced gradient (IRG) methods with different stepsize selections is proposed to construct sequences of iterates with stationary accumulation points. Convergence results with convergence rates for the developed IRG methods are established under the Kurdyka-Łojasiewicz property. The conducted numerical experiments confirm the efficiency of the proposed algorithms.
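To give a concrete flavor of the kind of scheme described above, the following Python sketch implements a generic inexact-gradient descent with a backtracking (Armijo-type) stepsize selection. It is an illustration only, not the paper's actual IRG algorithm: the function names, the error model behind `grad_inexact`, and the parameters `sigma`, `beta`, and `tol` are assumptions made for this example.

```python
import numpy as np

def inexact_gradient_descent(f, grad_inexact, x0, t0=1.0, beta=0.5,
                             sigma=1e-4, tol=1e-6, max_iter=1000):
    """Illustrative line search method using inexact gradients.

    `grad_inexact(x)` is assumed to return an approximation g of the
    true gradient of f at x; this sketch does not model the paper's
    specific error-control rules.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_inexact(x)
        if np.linalg.norm(g) <= tol:  # approximate stationarity test
            break
        t = t0
        # Backtrack until a sufficient-decrease condition holds
        # (or the stepsize becomes negligibly small).
        while f(x - t * g) > f(x) - sigma * t * np.dot(g, g) and t > 1e-12:
            t *= beta
        x = x - t * g
    return x

# Usage: minimize a simple quadratic with a noisy gradient oracle.
f = lambda x: 0.5 * np.dot(x, x)
grad_inexact = lambda x: x + 1e-8 * np.random.randn(x.size)
x_star = inexact_gradient_descent(f, grad_inexact, np.array([3.0, -4.0]))
```

Under the Kurdyka-Łojasiewicz property mentioned in the abstract, methods of this general type can be shown to produce convergent iterate sequences with quantifiable rates; the sketch above only illustrates the iteration structure, not those guarantees.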
