On the Convergence of (Stochastic) Gradient Descent with Extrapolation for Non-Convex Optimization
Extrapolation is a well-known technique for solving convex optimization
problems and variational inequalities, and it has recently attracted attention
for non-convex optimization. Several recent works have empirically shown its
success in some machine learning tasks. However, it has not been analyzed for
non-convex minimization, so a gap remains between theory and practice. In this
paper, we analyze gradient descent and stochastic gradient descent with
extrapolation for finding an approximate first-order stationary point of
smooth non-convex optimization problems. Our convergence upper bounds show
that the algorithms with extrapolation converge faster than their counterparts
without extrapolation.
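As a hedged illustration of the idea (not the paper's exact scheme), the sketch below runs gradient descent where the gradient is evaluated at an extrapolated point; the objective, step size eta, and extrapolation weight beta are assumptions chosen for the example.

```python
import numpy as np

def gd_with_extrapolation(grad, x0, eta=0.1, beta=0.5, iters=100):
    """Gradient descent with extrapolation (illustrative sketch).

    Each step evaluates the gradient at the extrapolated point
    y_t = x_t + beta * (x_t - x_{t-1}) rather than at x_t itself.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)       # extrapolated point
        x_prev, x = x, y - eta * grad(y)  # gradient step taken at y
    return x

# Example: a smooth non-convex test function f(x) = sum(x**2 + 0.1*sin(5*x))
grad_f = lambda x: 2 * x + 0.5 * np.cos(5 * x)
x_star = gd_with_extrapolation(grad_f, np.array([3.0, -2.0]))
```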
Generalized Forward-Backward Splitting
This paper introduces the generalized forward-backward splitting algorithm
for minimizing convex functions of the form $F + \sum_{i=1}^{n} G_i$, where
$F$ has a Lipschitz-continuous gradient and the $G_i$'s are simple in the
sense that their Moreau proximity operators are easy to compute. While the
forward-backward algorithm cannot deal with more than one non-smooth
function, our method generalizes it to the case of arbitrary $n$. Our method
makes explicit use of the regularity of $F$ in the forward step, and the
proximity operators of the $G_i$'s are applied in parallel in the backward
step. This allows the generalized forward-backward to efficiently address an
important class of convex problems. We prove its convergence in infinite
dimension, and its robustness to errors in the computation of the proximity
operators and of the gradient of $F$. Examples on inverse problems in imaging
demonstrate the advantage of the proposed method in comparison to other
splitting algorithms.
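The parallel-prox structure described above can be sketched as follows; this is a minimal illustration assuming a generic smooth part F and prox-friendly G_i's, with weights, relaxation, and step sizes chosen for the example rather than taken from the paper.

```python
import numpy as np

def generalized_forward_backward(grad_F, proxes, x0, gamma=0.1, lam=1.0, iters=200):
    """Illustrative generalized forward-backward for minimizing F + sum_i G_i.

    grad_F : gradient of the smooth part F (Lipschitz gradient assumed)
    proxes : list of prox operators, proxes[i](v, t) = prox_{t * G_i}(v)
    One explicit (forward) gradient step on F; the prox (backward) steps on
    the G_i's act in parallel on auxiliary variables z_i.
    """
    n = len(proxes)
    w = np.full(n, 1.0 / n)            # weights summing to one
    z = [x0.copy() for _ in range(n)]  # one auxiliary variable per G_i
    x = x0.copy()
    for _ in range(iters):
        g = grad_F(x)
        for i in range(n):             # parallelizable loop
            z[i] = z[i] + lam * (proxes[i](2 * x - z[i] - gamma * g, gamma / w[i]) - x)
        x = sum(wi * zi for wi, zi in zip(w, z))
    return x

# Example: F(x) = 0.5*||x - b||^2, G_1 = l1 norm, G_2 = indicator of [0, inf)
b = np.array([1.0, -2.0, 3.0])
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # prox of t*l1
proj = lambda v, t: np.maximum(v, 0.0)                           # prox of indicator
x_hat = generalized_forward_backward(lambda x: x - b, [soft, proj], np.zeros(3))
```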
Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
This work proposes block-coordinate fixed point algorithms with applications
to nonlinear analysis and optimization in Hilbert spaces. The asymptotic
analysis relies on a notion of stochastic quasi-Fejér monotonicity, which is
thoroughly investigated. The iterative methods under consideration feature
random sweeping rules to select arbitrarily the blocks of variables that are
activated over the course of the iterations, and they allow for stochastic
errors in the evaluation of the operators. Algorithms using quasinonexpansive
operators or compositions of averaged nonexpansive operators are constructed,
and weak and strong convergence results are established for the sequences they
generate. As a by-product, novel block-coordinate operator splitting methods
are obtained for solving structured monotone inclusion and convex minimization
problems. In particular, the proposed framework leads to random
block-coordinate versions of the Douglas-Rachford and forward-backward
algorithms and of some of their variants. In the standard case of a single
block, our results remain new as they incorporate stochastic perturbations.
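A minimal Euclidean sketch of the random-sweeping idea in the forward-backward case; the activation probability, blocks, and operators below are illustrative assumptions, and the stochastic-error aspect of the framework is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_bc_forward_backward(grad_F, prox_blocks, x0, blocks, gamma=0.5,
                               p=0.5, iters=500):
    """Illustrative random-sweeping block-coordinate forward-backward.

    At each iteration every block is activated independently with
    probability p (the random sweeping rule); activated blocks take a
    forward-backward step, the others are left unchanged.
    """
    x = x0.copy()
    for _ in range(iters):
        g = grad_F(x)                # in practice partial gradients suffice
        for j, idx in enumerate(blocks):
            if rng.random() < p:     # random sweeping: activate block j
                x[idx] = prox_blocks[j](x[idx] - gamma * g[idx], gamma)
    return x

# Example: minimize 0.5*||x - b||^2 + ||x||_1 over two coordinate blocks
b = np.array([3.0, -0.5, 2.0, 0.1])
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
x_hat = random_bc_forward_backward(lambda x: x - b, [soft, soft],
                                   np.zeros(4), [slice(0, 2), slice(2, 4)])
```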
Parallel LQP alternating direction method for solving variational inequality problems with separable structure
In this paper, we propose a logarithmic-quadratic proximal alternating direction method for structured variational inequalities. The predictor is obtained by solving a series of related systems of nonlinear equations, and the new iterate is obtained as a convex combination of the previous point and the one generated by a projection-type method along a new descent direction. Global convergence of the new method is proved under certain assumptions. Preliminary numerical experiments are included to verify the theoretical assertions of the proposed method.
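The corrector described above, a convex combination of the previous point with a projection-type step along a descent direction, can be sketched generically as follows; the direction d and the parameters alpha and sigma are placeholders for illustration, not the paper's LQP construction.

```python
import numpy as np

def corrector_step(x, d, proj, alpha=1.0, sigma=0.5):
    """Generic corrector update: convex combination of the previous point x
    and a projection-type step along a descent direction d (in the paper, d
    would be built from the LQP predictor; alpha and sigma are illustrative).
    """
    return (1.0 - sigma) * x + sigma * proj(x - alpha * d)

# Example with projection onto the nonnegative orthant
x_new = corrector_step(np.array([1.0, 2.0]), np.array([0.2, -0.1]),
                       lambda v: np.maximum(v, 0.0))
```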
Dynamical systems and forward-backward algorithms associated with the sum of a convex subdifferential and a monotone cocoercive operator
In a Hilbert framework, we introduce continuous and discrete dynamical
systems which aim at solving inclusions governed by structured monotone
operators $A = \partial\Phi + B$, where $\partial\Phi$ is the subdifferential
of a convex lower semicontinuous function $\Phi$, and $B$ is a monotone
cocoercive operator. We first consider the extension to this setting of the
regularized Newton dynamic with two potentials. Then, we revisit some related
dynamical systems, namely the semigroup of contractions generated by $A$, and
the continuous gradient-projection dynamic. By a Lyapunov analysis, we show
the convergence properties of the orbits of these systems.
The time discretization of these dynamics gives various forward-backward
splitting methods (some new) for solving structured monotone inclusions
involving non-potential terms. The convergence of these algorithms is obtained
under a classical step-size limitation. Perspectives are given in the field of
numerical splitting methods for optimization, and multi-criteria decision
processes.
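A sketch of the basic forward-backward discretization for the inclusion $0 \in \partial\Phi(x) + B(x)$: an explicit step on the cocoercive operator B followed by a proximal step on Phi. The particular Phi, B, and step size below are illustrative assumptions.

```python
import numpy as np

def forward_backward(prox_Phi, B, x0, gamma=0.5, iters=300):
    """Forward-backward iteration x_{k+1} = prox_{gamma*Phi}(x_k - gamma*B(x_k)):
    explicit (forward) step on the cocoercive operator B, implicit
    (backward/prox) step on the subdifferential of Phi.
    """
    x = x0.copy()
    for _ in range(iters):
        x = prox_Phi(x - gamma * B(x), gamma)
    return x

# Example: Phi = l1 norm (prox = soft threshold), B(x) = x - b (cocoercive)
b = np.array([2.0, -1.0])
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
x_hat = forward_backward(soft, lambda x: x - b, np.zeros(2))
```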
Semi-proximal Mirror-Prox for Nonsmooth Composite Minimization
We propose a new first-order optimization algorithm to solve high-dimensional
non-smooth composite minimization problems. Typical examples of such problems
have an objective that decomposes into a non-smooth empirical risk part and a
non-smooth regularization penalty. The proposed algorithm, called Semi-Proximal
Mirror-Prox, leverages the Fenchel-type representation of one part of the
objective while handling the other part of the objective via linear
minimization over the domain. The algorithm stands in contrast with more
classical proximal gradient algorithms with smoothing, which require the
computation of proximal operators at each iteration and can therefore be
impractical for high-dimensional problems. We establish the theoretical
convergence rate of Semi-Proximal Mirror-Prox, which exhibits the optimal
complexity bounds, i.e. $O(1/\epsilon^2)$, for the number of calls to the
linear minimization oracle. We present promising experimental results showing
the interest of the approach in comparison to competing methods.
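For orientation, here is the classical Euclidean Mirror-Prox (extragradient) iteration that the method builds on; this is not the semi-proximal variant, which additionally routes part of each step through a linear minimization oracle. The operator F, domain projection, and step size are assumptions for the example.

```python
import numpy as np

def mirror_prox(F, proj, u0, eta=0.1, iters=500):
    """Basic Euclidean Mirror-Prox for a monotone operator F on a convex set:
    an extrapolation (leader) step, then an update step using the operator
    value at the leader point; the ergodic average is returned.
    """
    u = u0.copy()
    avg = np.zeros_like(u0)
    for k in range(1, iters + 1):
        v = proj(u - eta * F(u))   # extrapolation step
        u = proj(u - eta * F(v))   # update using F evaluated at v
        avg += (v - avg) / k       # running (ergodic) average
    return avg

# Example: F(u) = u - b (monotone), domain = nonnegative orthant
b = np.array([1.0, -2.0])
u_hat = mirror_prox(lambda u: u - b, lambda v: np.maximum(v, 0.0), np.zeros(2))
```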