## Variable metric inexact line-search-based methods for nonsmooth optimization

### Abstract

We develop a new proximal-gradient method for minimizing the sum of a differentiable, possibly nonconvex, function and a convex, possibly nondifferentiable, function. The key features of the proposed method are the definition of a suitable descent direction, based on the proximal operator associated with the convex part of the objective function, and an Armijo-like rule to determine the stepsize along this direction, ensuring sufficient decrease of the objective function. In this framework, we especially address the possibility of adopting a metric that may change at each iteration and an inexact computation of the proximal point defining the descent direction. For the more general nonconvex case, we prove that all limit points of the iterate sequence are stationary, while for convex objective functions we prove convergence of the whole sequence to a minimizer, under the assumption that a minimizer exists. In the latter case, assuming also that the gradient of the smooth part of the objective function is Lipschitz continuous, we give a convergence rate estimate, showing $\mathcal{O}(1/k)$ complexity with respect to the function values. We also discuss verifiable sufficient conditions for the inexact proximal point and present the results of two numerical tests on total-variation-based image restoration problems, showing that the proposed approach is competitive with other state-of-the-art methods.
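The basic iteration described above can be sketched in Python for the special case of a Euclidean (fixed) metric and an exactly computed proximal point, on the model problem $\min_x \frac{1}{2}\|Ax-b\|^2 + \mu\|x\|_1$. This is only an illustrative simplification: the paper's method additionally allows a variable metric and an inexact prox, and the parameter names (`sigma`, `delta`) are assumptions, not the paper's notation.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (closed form for the l1 norm).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_grad_linesearch(A, b, mu, x0, sigma=1e-4, delta=0.5, iters=200):
    """Euclidean-metric sketch of a line-search proximal-gradient method
    for 0.5*||Ax-b||^2 + mu*||x||_1. Illustrative, not the paper's exact scheme."""
    f0 = lambda x: 0.5 * np.sum((A @ x - b) ** 2)   # smooth part
    f1 = lambda x: mu * np.sum(np.abs(x))           # convex nonsmooth part
    grad = lambda x: A.T @ (A @ x - b)
    alpha = 1.0 / np.linalg.norm(A, 2) ** 2         # steplength 1/L
    x = x0.astype(float)
    for _ in range(iters):
        g = grad(x)
        y = soft_threshold(x - alpha * g, alpha * mu)  # proximal point
        d = y - x                                      # descent direction
        # Predicted decrease: linearized smooth term plus change in f1.
        Delta = g @ d + f1(y) - f1(x)
        if Delta > -1e-12:
            break  # d is (approximately) a null direction: stationarity
        lam = 1.0
        # Armijo-like backtracking along d ensuring sufficient decrease.
        while (f0(x + lam * d) + f1(x + lam * d)
               > f0(x) + f1(x) + sigma * lam * Delta):
            lam *= delta
        x = x + lam * d
    return x
```

For example, with `A` the identity the iteration reduces to soft-thresholding of `b`, so `prox_grad_linesearch(np.eye(2), np.array([3.0, -0.5]), 1.0, np.zeros(2))` returns approximately `[2.0, 0.0]`. The quantity `Delta` is nonpositive by construction of the proximal point, which is what makes `d` a descent direction and guarantees the backtracking loop terminates.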

Topics: proximal algorithms, nonsmooth optimization, generalized projection, nonconvex optimization
Year: 2016
DOI identifier: 10.1137/15M1019325
OAI identifier: oai:iris.unife.it:11392/2353213