The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than $1/k^2$
The {\it forward-backward algorithm} is a powerful tool for solving
optimization problems with an {\it additively separable} and {\it smooth} + {\it
nonsmooth} structure. In the convex setting, a simple but ingenious
acceleration scheme developed by Nesterov improves the
theoretical rate of convergence for the function values from the standard
$\mathcal{O}(k^{-1})$ down to $\mathcal{O}(k^{-2})$. In this short paper, we
prove that the rate of convergence of a slight variant of Nesterov's
accelerated forward-backward method, which produces {\it convergent} sequences,
is actually $o(k^{-2})$, rather than $\mathcal{O}(k^{-2})$. Our arguments rely
on the connection between this algorithm and a second-order differential
inclusion with vanishing damping.
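
To make the scheme concrete, here is a minimal numerical sketch of an inertial (Nesterov-type) forward-backward iteration with extrapolation coefficient $k/(k+\alpha)$, $\alpha > 3$, of the kind discussed above. The least-squares-plus-$\ell_1$ model, the function names and the parameter values are illustrative assumptions, not the paper's exact setting.

    import numpy as np

    def soft_threshold(v, tau):
        # Proximal operator of tau * ||.||_1 (soft-thresholding)
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def accelerated_forward_backward(A, b, lam, alpha=4.0, n_iter=500):
        # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by an inertial forward-backward method
        L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient of the smooth part
        step = 1.0 / L
        x = np.zeros(A.shape[1])
        x_prev = x.copy()
        for k in range(1, n_iter + 1):
            # Inertial (extrapolation) step with coefficient k/(k + alpha), alpha > 3
            y = x + (k / (k + alpha)) * (x - x_prev)
            # Forward (gradient) step on the smooth part, backward (prox) step on the l1 part
            grad = A.T @ (A @ y - b)
            x_prev, x = x, soft_threshold(y - step * grad, step * lam)
        return x

    # Illustrative use on a small random problem
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 50))
    b = rng.standard_normal(20)
    x_hat = accelerated_forward_backward(A, b, lam=0.1)
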
Splitting methods with variable metric for KL functions
We study the convergence of general abstract descent methods applied to a
lower semicontinuous nonconvex function f that satisfies the
Kurdyka-Lojasiewicz inequality in a Hilbert space. We prove that any precompact
sequence converges to a critical point of f and obtain new convergence rates
both for the values and the iterates. The analysis covers alternating versions
of the forward-backward method with variable metric and relative errors. As an
example, a nonsmooth and nonconvex version of the Levenberg-Marquardt algorithm
is detailed.
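
As a concrete illustration of a forward-backward step in a variable metric, the following sketch uses a diagonal metric so that the proximal step for an $\ell_1$ term stays coordinatewise. The diagonal choice of metric, the $\ell_1$ model and all names are assumptions made for illustration, not the abstract setting analysed in the paper.

    import numpy as np

    def variable_metric_fb_step(x, grad_f, lam, D):
        # One forward-backward step for f + lam*||.||_1 in the metric induced by diag(D) > 0:
        # forward (gradient) step preconditioned by D, then the prox of the l1 term in that
        # metric, which is soft-thresholding with per-coordinate thresholds lam / D[i].
        y = x - grad_f(x) / D
        thr = lam / D
        return np.sign(y) * np.maximum(np.abs(y) - thr, 0.0)

    # Illustrative use with a quadratic smooth part and a diagonal metric
    Q = np.diag([1.0, 4.0, 9.0])
    grad_f = lambda x: Q @ x
    x_next = variable_metric_fb_step(np.array([2.0, -1.0, 0.5]), grad_f, lam=0.1, D=np.diag(Q))
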
Fast convex optimization via inertial dynamics with Hessian driven damping
We first study the fast minimization properties of the trajectories of the
second-order evolution equation
$$\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\,\nabla^2\Phi(x(t))\,\dot{x}(t) + \nabla\Phi(x(t)) = 0,$$
where $\Phi$ is a smooth convex function acting on a real
Hilbert space $\mathcal{H}$, and $\alpha$, $\beta$ are positive parameters. This
inertial system combines an isotropic viscous damping which vanishes
asymptotically, and a geometrical Hessian driven damping, which makes it
naturally related to Newton's and Levenberg-Marquardt methods. For $\alpha \geq 3$, $\beta > 0$, along any trajectory, fast convergence of the values
$\Phi(x(t)) - \min_{\mathcal{H}} \Phi = \mathcal{O}(t^{-2})$ is
obtained, together with rapid convergence of the gradients $\nabla\Phi(x(t))$
to zero. For $\alpha > 3$, just assuming that $\Phi$ has minimizers, we show that
any trajectory converges weakly to a minimizer of $\Phi$, and $\Phi(x(t)) - \min_{\mathcal{H}} \Phi = o(t^{-2})$. Strong convergence is
established in various practical situations. For the strongly convex case,
convergence can be arbitrarily fast depending on the choice of $\alpha$. More
precisely, we have $\Phi(x(t)) - \min_{\mathcal{H}} \Phi = \mathcal{O}(t^{-\frac{2}{3}\alpha})$. We extend the results to the case of a general
proper lower-semicontinuous convex function $\Phi$. This is based on the fact that the inertial
dynamic with Hessian driven damping can be written as a first-order system in
time and space. By explicit-implicit time discretization, this opens a gate to
new, possibly more rapid, inertial algorithms, expanding the field of
FISTA methods for convex structured optimization problems.
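
To visualise the dynamic described above, here is a minimal sketch that integrates the second-order equation for a quadratic $\Phi(x) = \tfrac{1}{2}x^\top Q x$, so that the Hessian-driven term reduces to $\beta Q \dot{x}$. This is a plain explicit Euler integration meant only as an illustration of the trajectories; it is not the explicit-implicit discretization mentioned in the abstract, and the step size, horizon and parameter values are assumptions.

    import numpy as np

    def integrate_hessian_damped(Q, x0, alpha=4.0, beta=1.0, h=1e-3, T=20.0, t0=1.0):
        # Explicit Euler integration of x'' + (alpha/t) x' + beta*Hess(Phi)(x) x' + grad(Phi)(x) = 0
        # for the quadratic Phi(x) = 0.5 * x^T Q x, where Hess(Phi)(x) x' = Q v and grad(Phi)(x) = Q x.
        x = x0.astype(float)
        v = np.zeros_like(x)
        t = t0
        values = []
        while t < T:
            a = -(alpha / t) * v - beta * (Q @ v) - Q @ x   # acceleration prescribed by the dynamic
            x = x + h * v
            v = v + h * a
            t = t + h
            values.append(0.5 * x @ Q @ x)                  # Phi(x(t)) along the trajectory
        return x, values

    # Illustrative run on an ill-conditioned quadratic
    Q = np.diag([1.0, 10.0, 100.0])
    x_final, vals = integrate_hessian_damped(Q, np.array([1.0, 1.0, 1.0]))
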
Forward-backward approximation of nonlinear semigroups in finite and infinite horizon
Inertial Krasnoselskii-Mann Iterations
We establish the weak convergence of inertial Krasnoselskii-Mann iterations
towards a common fixed point of a family of quasi-nonexpansive operators, along
with worst case estimates for the rate at which the residuals vanish. Strong
and linear convergence are obtained in the quasi-contractive setting. In both
cases, we highlight the relationship with the non-inertial case, and show that
passing from one regime to the other is a continuous process in terms of
parameter hypotheses and convergence rates. Numerical illustrations are provided for an
inertial primal-dual method and an inertial three-operator splitting algorithm,
whose performance is superior to that of their non-inertial counterparts.
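
The iteration itself is short enough to sketch. Below is a minimal illustration of an inertial Krasnoselskii-Mann step applied to a simple nonexpansive operator (projection onto a box); the operator, the parameters theta and lam, and all names are illustrative assumptions, not the hypotheses used in the paper.

    import numpy as np

    def project_box(x, lo=-1.0, hi=1.0):
        # Projection onto a box: a simple nonexpansive operator whose fixed points fill [lo, hi]^n
        return np.clip(x, lo, hi)

    def inertial_km(T, x0, theta=0.3, lam=0.5, n_iter=200):
        # Inertial Krasnoselskii-Mann iteration:
        #   y_k = x_k + theta * (x_k - x_{k-1});   x_{k+1} = (1 - lam) * y_k + lam * T(y_k)
        x_prev = x0.copy()
        x = x0.copy()
        for _ in range(n_iter):
            y = x + theta * (x - x_prev)                   # inertial (extrapolation) step
            x_prev, x = x, (1.0 - lam) * y + lam * T(y)    # relaxed fixed-point step
        return x

    # Illustrative run: converges to a fixed point of the projection (a point of the box)
    x_star = inertial_km(project_box, np.array([5.0, -3.0, 0.2]))
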