
STABILITY PROPERTIES OF THE GRADIENT PROJECTION METHOD WITH APPLICATIONS TO THE BACKPROPAGATION ALGORITHM

Abstract

Convergence properties of the generalized gradient projection algorithm in the presence of data perturbations are investigated. It is shown that every trajectory of the method is attracted, in a certain sense, to an ε-stationary set of the problem, where ε depends on the magnitude of the perturbations. Estimates for the attraction sets of the iterates are given in the general (nonsmooth and nonconvex) case. In the convex case, our results imply convergence to an ε-optimal set. The results are further strengthened for weakly sharp and strongly convex problems. Convergence of the parallel algorithm in the case of an additive objective function is established. One of the principal applications of our results is the stability analysis of the classical backpropagation algorithm for training artificial neural networks.
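As a rough illustration of the iteration analyzed in the abstract, the following is a minimal sketch of a projected gradient step applied to perturbed (noisy) gradient data. The objective, the box constraint, the step size, and the noise model are assumptions chosen for illustration, not the paper's actual setting.

```python
import numpy as np

def perturbed_gradient_projection(grad, project, x0, steps=1000, alpha=0.01,
                                  noise=0.0, rng=None):
    """Sketch of the gradient projection iteration with perturbed gradients:
        x_{k+1} = P_X( x_k - alpha * (grad f(x_k) + e_k) ),
    where e_k models the data perturbation. The choices of grad, project,
    and the noise model below are illustrative assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        e = noise * rng.standard_normal(x.shape)  # random gradient perturbation (assumed model)
        x = project(x - alpha * (grad(x) + e))    # projected step on the perturbed gradient
    return x

# Example (hypothetical): minimize ||x - c||^2 over the box [0, 1]^2
# using inexact gradients; the iterates settle near a neighborhood of
# the solution whose size depends on the noise level.
c = np.array([0.3, 1.5])
x_approx = perturbed_gradient_projection(
    grad=lambda x: 2.0 * (x - c),
    project=lambda x: np.clip(x, 0.0, 1.0),
    x0=np.zeros(2),
    noise=0.05,
)
```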
