We first study the fast minimization properties of the trajectories of the
second-order evolution equation ẍ(t) + (α/t)ẋ(t) + β∇²Φ(x(t))ẋ(t) + ∇Φ(x(t)) = 0, where
Φ:H→R is a smooth convex function acting on a real
Hilbert space H, and α, β are positive parameters. This
inertial system combines an isotropic viscous damping which vanishes
asymptotically, and a geometrical Hessian driven damping, which makes it
naturally related to Newton's and Levenberg-Marquardt methods. For α ≥ 3, β > 0, along any trajectory, fast convergence of the values,
Φ(x(t)) − min_H Φ = O(t^(-2)), is
obtained, together with rapid convergence of the gradients ∇Φ(x(t))
to zero. For α > 3, just assuming that Φ has minimizers, we show that
any trajectory converges weakly to a minimizer of Φ, and Φ(x(t)) − min_H Φ = o(t^(-2)). Strong convergence is
established in various practical situations. For the strongly convex case,
convergence can be arbitrarily fast depending on the choice of α. More
precisely, we have Φ(x(t)) − min_H Φ = O(t^(-2α/3)). We extend the results to the case of a general
proper lower-semicontinuous convex function Φ:H→R∪{+∞}. This is based on the fact that the inertial
dynamic with Hessian driven damping can be written as a first-order system in
time and space. By explicit-implicit time discretization, this opens a gate to
new, possibly more rapid, inertial algorithms, expanding the field of
FISTA methods for convex structured optimization problems.
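As an illustration of the first-order reformulation mentioned above, setting y(t) = ẋ(t) + β∇Φ(x(t)) and using d/dt[∇Φ(x(t))] = ∇²Φ(x(t))ẋ(t) turns the second-order equation into the Hessian-free system ẋ = y − β∇Φ(x), ẏ = −(α/t)y + (αβ/t − 1)∇Φ(x). The sketch below, which is not the paper's algorithm, applies a plain explicit Euler discretization of this system to the toy objective Φ(x) = ½x²; the parameter choices (α = 4, β = 1, step h, horizon T) are illustrative assumptions.

```python
# Toy objective: Φ(x) = ½ x², so ∇Φ(x) = x and min_H Φ = 0.
alpha, beta = 4.0, 1.0          # illustrative choices with α > 3, β > 0
grad = lambda x: x

# First-order system obtained via y = ẋ + β∇Φ(x) (no Hessian appears):
#   ẋ(t) = y(t) − β ∇Φ(x(t))
#   ẏ(t) = −(α/t) y(t) + (αβ/t − 1) ∇Φ(x(t))
t, T, h = 1.0, 50.0, 0.01       # start at t = 1 to avoid the α/t singularity
x = 1.0
y = 0.0 + beta * grad(x)        # y(1) = ẋ(1) + β∇Φ(x(1)), taking ẋ(1) = 0

while t < T:
    g = grad(x)
    x_new = x + h * (y - beta * g)
    y_new = y + h * (-(alpha / t) * y + (alpha * beta / t - 1.0) * g)
    x, y, t = x_new, y_new, t + h

phi = 0.5 * x * x               # Φ(x(T)) − min_H Φ; decays as t grows
print(phi)
```

The point of the reformulation is visible in the code: the update uses only gradient evaluations, which is what makes explicit-implicit (forward-backward) discretizations, and hence FISTA-like algorithms, possible for a nonsmooth Φ.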