A Multi-step Inertial Forward--Backward Splitting Method for Non-convex Optimization
In this paper, we propose a multi-step inertial Forward--Backward splitting
algorithm for minimizing the sum of two functions that are not necessarily
convex, one of which is proper and lower semi-continuous while the other is
differentiable with a Lipschitz continuous gradient. We first prove global
convergence of the scheme with the help of the Kurdyka-Łojasiewicz property.
Then, when the
non-smooth part is also partly smooth relative to a smooth submanifold, we
establish finite identification of the latter and provide sharp local linear
convergence analysis. The proposed method is illustrated on a few problems
arising from statistics and machine learning.
Comment: This paper is a companion to our recent work on Forward--Backward-type splitting methods, http://arxiv.org/abs/1503.0370
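As a concrete illustration, below is a minimal Python sketch of a multi-step inertial forward-backward iteration. It simplifies the paper's scheme: a single extrapolation point is shared by the gradient and prox steps, the inertial weights a are fixed by hand (the paper derives admissible choices), and grad_f and prox_g are user-supplied; all names here are ours, not the paper's.

import numpy as np

def soft_threshold(v, t):
    # prox of t * ||.||_1, used in the lasso example below
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def mifb(grad_f, prox_g, x0, L, a=(0.3, 0.1), n_iter=500):
    step = 1.0 / L                                  # forward step size
    hist = [x0.copy() for _ in range(len(a) + 1)]   # x_k, x_{k-1}, ..., x_{k-s}
    for _ in range(n_iter):
        x = hist[0]
        # multi-step inertial extrapolation over the last len(a) steps
        y = x + sum(ai * (hist[i] - hist[i + 1]) for i, ai in enumerate(a))
        # forward (gradient) step on the smooth part, backward (prox) step on the other
        x_new = prox_g(y - step * grad_f(y), step)
        hist = [x_new] + hist[:-1]
    return hist[0]

# example: lasso, min 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
x = mifb(lambda x: A.T @ (A @ x - b),
         lambda v, t: soft_threshold(v, lam * t),
         np.zeros(100), L=np.linalg.norm(A, 2) ** 2)

With a single inertial weight, a = (a1,), the scheme reduces to the familiar one-step inertial (heavy-ball-style) forward-backward iteration.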
Newton-MR: Inexact Newton Method With Minimum Residual Sub-problem Solver
We consider a variant of the inexact Newton method, called Newton-MR, in which
the least-squares sub-problems are solved approximately using the minimum
residual (MINRES) method. By construction, Newton-MR can be readily applied for unconstrained
optimization of a class of non-convex problems known as invex, which subsumes
convexity as a sub-class. For invex optimization, instead of the classical
Lipschitz continuity assumptions on gradient and Hessian, Newton-MR's global
convergence can be guaranteed under a weaker notion of joint regularity of
Hessian and gradient. We also obtain Newton-MR's problem-independent local
convergence to the set of minima. We show that fast local/global convergence
can be guaranteed under a novel inexactness condition which, to our knowledge,
is much weaker than those in prior related work. Numerical results demonstrate the
performance of Newton-MR as compared with several other Newton-type
alternatives on a few machine learning problems.
Comment: 35 pages
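To make the construction concrete, here is a rough sketch of a Newton-MR iteration using SciPy's MINRES as the sub-problem solver. The inexactness and line-search conditions are simplified stand-ins for the paper's (the sketch backtracks on the squared gradient norm); function names and parameters are ours.

import numpy as np
from scipy.sparse.linalg import LinearOperator, minres

def newton_mr(grad, hess_vec, x0, n_iter=50, sub_iters=20, rho=1e-4):
    # grad(x): gradient; hess_vec(x, v): Hessian-vector product (symmetric H)
    x = x0.copy()
    n = x.size
    for _ in range(n_iter):
        g = grad(x)
        H = LinearOperator((n, n), matvec=lambda v, x=x: hess_vec(x, v),
                           dtype=x.dtype)
        # inexact step: approximately minimize ||H p + g|| with MINRES,
        # which, unlike CG, tolerates an indefinite Hessian
        p, _ = minres(H, -g, maxiter=sub_iters)
        # backtracking line search on the squared gradient norm
        # (a simplified version of the paper's Armijo-type condition)
        alpha, g2 = 1.0, float(g @ g)
        while True:
            g_new = grad(x + alpha * p)
            if float(g_new @ g_new) <= (1.0 - 2.0 * rho * alpha) * g2 or alpha < 1e-10:
                break
            alpha *= 0.5
        x = x + alpha * p
    return x

Measuring progress through ||grad f||^2 rather than f itself is what suits the invex setting, where every stationary point is a global minimum.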
A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
We introduce Bella, a locally superlinearly convergent Bregman
forward-backward splitting method for minimizing the sum of two nonconvex
functions, one of which satisfies a relative smoothness condition while the
other may be nonsmooth. A key tool of our methodology is the Bregman
forward-backward envelope (BFBE), an exact and continuous penalty function with
favorable first- and second-order properties that enjoys a nonlinear error
bound when the objective function satisfies a Łojasiewicz-type property. The
proposed algorithm is of linesearch type over the BFBE along candidate update
directions, and it converges subsequentially to stationary points, globally
under a Kurdyka-Łojasiewicz (KL) condition. Owing to the given nonlinear error
bound, it can attain superlinear convergence rates even when the limit point is
a nonisolated minimum, provided the directions are suitably selected.
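A heavily simplified sketch of the linesearch template, assuming user-supplied Bregman machinery, might look as follows. The acceptance test and the choice of candidate direction are placeholders for the paper's precise conditions, and all names (bfbe, bella_sketch, breg_prox, D_h) are ours.

import numpy as np

def bfbe(f, grad_f, g, breg_prox, D_h, x, gamma):
    # Bregman forward-backward step at x and the BFBE value there.
    # breg_prox(v, x, gamma): argmin_z g(z) + <v, z - x> + D_h(z, x) / gamma
    # D_h(z, x): Bregman distance generated by a Legendre function h
    gx = grad_f(x)
    xbar = breg_prox(gx, x, gamma)
    phi = f(x) + gx @ (xbar - x) + g(xbar) + D_h(xbar, x) / gamma
    return xbar, phi

def bella_sketch(f, grad_f, g, breg_prox, D_h, x0, gamma,
                 n_iter=100, sigma=1e-4):
    x = x0.copy()
    xbar, phi = bfbe(f, grad_f, g, breg_prox, D_h, x, gamma)
    for _ in range(n_iter):
        # candidate direction: here the plain forward-backward direction;
        # the paper admits e.g. quasi-Newton directions for fast local rates
        d = xbar - x
        tau = 1.0
        while True:   # backtrack until the BFBE decreases sufficiently
            x_try = x + tau * d
            xbar_try, phi_try = bfbe(f, grad_f, g, breg_prox, D_h, x_try, gamma)
            if phi_try <= phi - sigma * tau * float(d @ d) or tau < 1e-8:
                break
            tau *= 0.5
        x, xbar, phi = x_try, xbar_try, phi_try
    return x

With d = xbar - x the iteration reduces to plain Bregman forward-backward; the superlinear rates described in the abstract require suitably selected (e.g. quasi-Newton) directions.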
Non-smooth Non-convex Bregman Minimization: Unification and new Algorithms
We propose a unifying algorithm for non-smooth non-convex optimization. The
algorithm approximates the objective function by a convex model function and
finds an approximate (Bregman) proximal point of the convex model. This
approximate minimizer of the model function yields a descent direction, along
which the next iterate is found. Complementing this with an Armijo-like line search
strategy, we obtain a flexible algorithm for which we prove (subsequential)
convergence to a stationary point under weak assumptions on the growth of the
model function error. Special instances of the algorithm with a Euclidean
distance function include, for example, Gradient Descent, Forward--Backward
Splitting, and ProxDescent, all without the common requirement of a "Lipschitz
continuous gradient". In addition, we consider a broad class of Bregman
distance functions (generated by Legendre functions) replacing the Euclidean
distance. The algorithm has a wide range of applications including many linear
and non-linear inverse problems in signal/image processing and machine
learning.
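The unified template admits a rough sketch along the following lines. We use a Euclidean proximity term for simplicity (the abstract also allows general Bregman distances generated by Legendre functions), and the Armijo-like test below measures decrease against ||d||^2 rather than the paper's model-decrease quantity; all names are ours.

import numpy as np

def model_descent(obj, model_prox, x0, t=1.0, n_iter=200,
                  gamma=1e-4, delta=0.5):
    # obj(x)           : the full non-smooth objective F(x)
    # model_prox(x, t) : (approximate) minimizer of F_x(z) + ||z - x||^2 / (2 t),
    #                    where F_x is a convex model of F around x
    x = x0.copy()
    for _ in range(n_iter):
        z = model_prox(x, t)       # proximal point of the convex model
        d = z - x                  # descent direction induced by the model
        # Armijo-like backtracking on the true objective along d
        eta, fx = 1.0, obj(x)
        while obj(x + eta * d) > fx - gamma * eta * float(d @ d) and eta > 1e-10:
            eta *= delta
        x = x + eta * d
    return x

Choosing the model F_x(z) = f(x) + <grad f(x), z - x> + g(z) recovers forward-backward splitting, while a model that linearizes inside a composition recovers ProxDescent, consistent with the special instances listed above.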