The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than $1/k^2$
The {\it forward-backward algorithm} is a powerful tool for solving
optimization problems with an {\it additively separable} and {\it smooth} + {\it
nonsmooth} structure. In the convex setting, a simple but ingenious
acceleration scheme developed by Nesterov has proved useful to improve the
theoretical rate of convergence for the function values from the standard
$\mathcal O(k^{-1})$ down to $\mathcal O(k^{-2})$. In this short paper, we
prove that the rate of convergence of a slight variant of Nesterov's
accelerated forward-backward method, which produces {\it convergent} sequences,
is actually $o(k^{-2})$, rather than $\mathcal O(k^{-2})$. Our arguments rely
on the connection between this algorithm and a second-order differential
inclusion with vanishing damping.
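One such convergence-preserving variant replaces the classical FISTA extrapolation by the coefficient $(k-1)/(k+\alpha-1)$ with $\alpha>3$, in the spirit of Chambolle and Dossal. A minimal sketch (the lasso instance, parameter values, and all names below are illustrative assumptions, not data from the paper):

```python
import numpy as np

def accelerated_forward_backward(grad_f, prox_g, L, x0, alpha=4.0, iters=2000):
    """FISTA-type accelerated forward-backward iteration.

    Extrapolates with (k-1)/(k+alpha-1), alpha > 3: the regime in which
    the iterates themselves converge and the function values decay
    faster than O(1/k^2).  grad_f: gradient of the smooth part;
    prox_g: proximal map of the nonsmooth part; L: Lipschitz constant
    of grad_f (step size 1/L).
    """
    x_prev = x0.copy()
    y = x0.copy()
    for k in range(1, iters + 1):
        x = prox_g(y - grad_f(y) / L, 1.0 / L)            # forward-backward step
        y = x + (k - 1) / (k + alpha - 1) * (x - x_prev)  # inertial extrapolation
        x_prev = x
    return x

# Illustrative smooth + nonsmooth problem (our choice, not from the paper):
# min_x 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 60))
b = rng.standard_normal(40)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
L = np.linalg.norm(A, 2) ** 2   # spectral norm squared = Lipschitz constant
x_star = accelerated_forward_backward(grad_f, prox_g, L, np.zeros(60))
```

The fixed points of the forward-backward map are exactly the minimizers, so a small residual $\|x - \mathrm{prox}_{g/L}(x - \nabla f(x)/L)\|$ certifies approximate optimality.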
Dynamical systems and forward-backward algorithms associated with the sum of a convex subdifferential and a monotone cocoercive operator
In a Hilbert framework, we introduce continuous and discrete dynamical
systems which aim at solving inclusions governed by structured monotone
operators $A = \partial\Phi + B$, where $\partial\Phi$ is the subdifferential of a
convex lower semicontinuous function $\Phi$, and $B$ is a monotone cocoercive
operator. We first consider the extension to this setting of the regularized
Newton dynamic with two potentials. Then, we revisit some related dynamical
systems, namely the semigroup of contractions generated by $A$, and the
continuous gradient projection dynamic. By a Lyapunov analysis, we show the
convergence properties of the orbits of these systems.
The time discretization of these dynamics gives various forward-backward
splitting methods (some new) for solving structured monotone inclusions
involving non-potential terms. The convergence of these algorithms is obtained
under a classical step size limitation. Perspectives are given in the field of
numerical splitting methods for optimization, and multi-criteria decision
processes.
Comment: 25 pages
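The basic discrete scheme behind these methods alternates an explicit (forward) step on $B$ with a proximal (backward) step on $\partial\Phi$, under the classical step-size limitation $\lambda < 2\beta$ for a $\beta$-cocoercive $B$. A hedged sketch (the operator data and the choice $\Phi = \|\cdot\|_1$ below are ours, not the paper's):

```python
import numpy as np

def forward_backward(B, prox_phi, beta, x0, iters=2000):
    """Forward-backward splitting for the inclusion 0 in dPhi(x) + B(x).

    B is assumed beta-cocoercive; the classical step-size limitation is
    lam < 2*beta.  Each iteration takes an explicit (forward) step on B
    followed by an implicit (backward/proximal) step on dPhi.
    """
    lam = beta  # any step in (0, 2*beta) is admissible; beta is a safe choice
    x = x0.copy()
    for _ in range(iters):
        x = prox_phi(x - lam * B(x), lam)
    return x

# Illustrative data (our choice, not from the paper):
# Phi = l1 norm, B(x) = M x - q with M symmetric positive definite,
# so B is (1/lambda_max(M))-cocoercive.
rng = np.random.default_rng(1)
C = rng.standard_normal((30, 30))
M = C @ C.T + np.eye(30)
q = rng.standard_normal(30)
B = lambda x: M @ x - q
beta = 1.0 / np.linalg.norm(M, 2)   # cocoercivity modulus of B
prox_phi = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
sol = forward_backward(B, prox_phi, beta, np.zeros(30))
```

At a solution, one step of the scheme leaves the point unchanged, which is what the assertion below checks.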
Asymptotic behavior of gradient-like dynamical systems involving inertia and multiscale aspects
In a Hilbert space $\mathcal H$, we study the asymptotic behaviour, as time
variable $t$ goes to $+\infty$, of nonautonomous gradient-like dynamical
systems involving inertia and multiscale features.
Given $\mathcal H$ a general Hilbert space, $\Phi: \mathcal H \to \mathbb R$ and $\Psi: \mathcal H \to \mathbb R$ two convex
differentiable functions, $\gamma$ a positive damping parameter, and $\epsilon(t)$ a function of $t$ which tends to zero as $t$ goes to $+\infty$, we
consider the second-order differential equation
$$\ddot x(t) + \gamma \dot x(t) + \nabla\Phi(x(t)) + \epsilon(t)\nabla\Psi(x(t)) = 0.$$
This system models the emergence of various collective behaviors in game theory, as
well as the asymptotic control of coupled nonlinear oscillators. Assuming that
$\epsilon(t)$ tends to zero moderately slowly as $t$ goes to infinity, we show
that the trajectories converge weakly in $\mathcal H$. The limiting equilibria
are solutions of the hierarchical minimization problem which consists in
minimizing $\Psi$ over the set $S$ of minimizers of $\Phi$. As key assumptions,
we suppose that $\int_0^{+\infty} \epsilon(t)\,dt = +\infty$ and that, for
every $z$ belonging to a convex cone depending on the data $\Phi$
and $\Psi$,
$$\int_0^{+\infty} \epsilon(t)\left[\Psi^*\!\left(\tfrac{z}{\epsilon(t)}\right) - \sigma_S\!\left(\tfrac{z}{\epsilon(t)}\right)\right]dt < +\infty,$$
where $\Psi^*$ is
the Fenchel conjugate of $\Psi$, and $\sigma_S$ is the support function of
$S$. An application is given to coupled oscillators.
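As a hedged numerical illustration of the hierarchical selection (the functions, parameters, and discretization below are our own choices, not from the paper), one can integrate the equation with the slowly vanishing $\epsilon(t) = 1/t$, for which $\int \epsilon(t)\,dt = +\infty$:

```python
import numpy as np

# Semi-implicit Euler simulation of
#   x'' + gamma*x' + grad_Phi(x) + eps(t)*grad_Psi(x) = 0.
# Illustrative data: Phi(x) = 0.5*x[0]**2, so argmin Phi = {x : x[0] = 0};
# Psi(x) = 0.5*||x - a||^2.  The hierarchical solution (minimize Psi
# over argmin Phi) is then (0, a[1]).
gamma = 1.0
a = np.array([2.0, 3.0])
grad_Phi = lambda x: np.array([x[0], 0.0])
grad_Psi = lambda x: x - a

h = 0.05                      # time step
x = np.array([5.0, -1.0])     # initial position
v = np.zeros(2)               # initial velocity
t = 1.0
for _ in range(200_000):      # integrate up to t ~ 10^4
    acc = -gamma * v - grad_Phi(x) - (1.0 / t) * grad_Psi(x)
    v = v + h * acc
    x = x + h * v
    t += h
# x drifts toward the hierarchical solution (0, a[1]) = (0, 3)
```

The vanishing multiscale term does the selecting: the fast dynamics drive $x$ to the set $x_1 = 0$, while the slowly decaying $\epsilon(t)\nabla\Psi$ term steers the remaining free coordinate toward the minimizer of $\Psi$ on that set.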
Asymptotic behavior of coupled dynamical systems with multiscale aspects
We study the asymptotic behavior, as time $t$ goes to infinity, of
nonautonomous dynamical systems involving multiscale features. These systems
model the emergence of various collective behaviors in game theory, as well as
the asymptotic control of coupled systems.
Comment: 20 pages
Fast convex optimization via inertial dynamics with Hessian driven damping
We first study the fast minimization properties of the trajectories of the
second-order evolution equation
$$\ddot x(t) + \frac{\alpha}{t}\dot x(t) + \beta \nabla^2\Phi(x(t))\dot x(t) + \nabla\Phi(x(t)) = 0,$$
where $\Phi: \mathcal H \to \mathbb R$ is a smooth convex function acting on a real
Hilbert space $\mathcal H$, and $\alpha$, $\beta$ are positive parameters. This
inertial system combines an isotropic viscous damping which vanishes
asymptotically, and a geometrical Hessian driven damping, which makes it
naturally related to Newton's and Levenberg-Marquardt methods. For $\alpha \geq 3$, $\beta > 0$, along any trajectory, fast convergence of the values
$$\Phi(x(t)) - \min_{\mathcal H}\Phi = \mathcal O(t^{-2})$$
is obtained, together with rapid convergence of the gradients $\nabla\Phi(x(t))$
to zero. For $\alpha > 3$, just assuming that $\Phi$ has minimizers, we show that
any trajectory converges weakly to a minimizer of $\Phi$, and $\Phi(x(t)) - \min_{\mathcal H}\Phi = o(t^{-2})$. Strong convergence is
established in various practical situations. For the strongly convex case,
convergence can be arbitrarily fast depending on the choice of $\alpha$. More
precisely, we have $\Phi(x(t)) - \min_{\mathcal H}\Phi = \mathcal O(t^{-\frac{2}{3}\alpha})$. We extend the results to the case of a general
proper lower-semicontinuous convex function $\Phi: \mathcal H \to \mathbb R \cup \{+\infty\}$. This is based on the fact that the inertial
dynamic with Hessian driven damping can be written as a first-order system in
time and space. By explicit-implicit time discretization, this opens a gate to
new possibly more rapid inertial algorithms, expanding the field of
FISTA methods for convex structured optimization problems.
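A minimal simulation of the dynamic (our own sketch on an illustrative quadratic; none of the data below comes from the paper) shows the combined effect of the vanishing viscous damping $\alpha/t$ and the Hessian driven damping $\beta\nabla^2\Phi(x)\dot x$, which for a quadratic $\Phi$ reduces to a constant linear friction term:

```python
import numpy as np

# Explicit integration of
#   x'' + (alpha/t)*x' + beta*Hess_Phi(x)*x' + grad_Phi(x) = 0
# for the illustrative quadratic Phi(x) = 0.5 * x @ A @ x, whose
# Hessian is the constant matrix A.
rng = np.random.default_rng(2)
Q = rng.standard_normal((5, 5))
A = Q @ Q.T + 0.1 * np.eye(5)   # positive definite Hessian
Phi = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

alpha, beta = 4.0, 1.0
h = 0.001                       # time step
x = np.ones(5)
v = np.zeros(5)
t = 1.0
vals = [Phi(x)]
for _ in range(200_000):        # integrate up to t ~ 200
    acc = -(alpha / t) * v - beta * (A @ v) - grad(x)
    v = v + h * acc
    x = x + h * v
    t += h
    vals.append(Phi(x))
# Phi(x(t)) decays by many orders of magnitude along the trajectory
```

The Hessian term damps each eigenmode proportionally to its own curvature, which is what tames the oscillations that a purely viscous damping leaves in the transient.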