Minimization of Nonsmooth Convex Functionals in Banach Spaces

Abstract

We develop a unified framework for the convergence analysis of subgradient and subgradient projection methods for minimization of nonsmooth convex functionals in Banach spaces. The important novel features of our analysis are that we neither assume the functional to be uniformly or strongly convex, nor use regularization techniques. Moreover, no boundedness assumptions are made on the level sets of the functional or the feasible set of the problem. In fact, the solution set can be unbounded. Under very mild assumptions, we prove that the sequence of iterates is bounded and has at least one weak accumulation point which is a minimizer. Moreover, all weak accumulation points of the sequence of Cesàro averages of the iterates are solutions of the minimization problem. Under certain additional assumptions (which are satisfied for several important instances of Banach spaces), we are able to exhibit weak convergence of the whole sequence of iterates to one of the solutions of the optimization problem.
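For orientation, the sketch below records the classical subgradient projection iteration and the Cesàro averaging referred to in the abstract, written in the familiar Hilbert-space form for simplicity. The step sizes \(\alpha_k\), the metric projection \(P_C\), and this particular formulation are illustrative assumptions only; the Banach-space scheme analyzed in the paper may differ in detail.

% Illustrative (not the paper's exact scheme): subgradient projection step
% with feasible set C, convex functional f, step sizes \alpha_k > 0,
% and P_C the metric projection onto C.
\[
  x_{k+1} = P_C\bigl(x_k - \alpha_k g_k\bigr), \qquad g_k \in \partial f(x_k),
\]
% Cesàro averages of the iterates, whose weak accumulation points
% are shown to solve the minimization problem:
\[
  \bar{x}_k = \frac{1}{k+1} \sum_{i=0}^{k} x_i .
\]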
