In this short note, we provide a simple version of an accelerated
forward-backward method (a.k.a. Nesterov's accelerated proximal gradient
method) that may rely on approximate proximal operators and can exploit
strong convexity of the objective function. The method supports both
relative and absolute errors, and its behavior is illustrated on a set of
standard numerical experiments. Using the same developments, we further provide
a version of the accelerated proximal hybrid extragradient method of Monteiro
and Svaiter (2013) possibly exploiting strong convexity of the objective
function.

Comment: Minor modifications in notations and acknowledgments. These methods
were originally presented in arXiv:2006.06041v2. Code available at
https://github.com/mathbarre/StronglyConvexForwardBackwar
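For a concrete starting point, below is a minimal sketch of an accelerated forward-backward (proximal gradient) method with the constant momentum commonly used in the strongly convex setting. This is a standard textbook variant with exact proximal steps, not the inexact method of the note; the names accelerated_proximal_gradient, grad_f, and prox_g, and the elastic-net instance, are illustrative assumptions.

    import numpy as np

    def accelerated_proximal_gradient(grad_f, prox_g, x0, L, mu, n_iters=500):
        # Sketch of an accelerated forward-backward method for min_x f(x) + g(x),
        # with f L-smooth and mu-strongly convex, and g proper closed convex.
        # Constant momentum: beta = (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)).
        x = x_prev = np.asarray(x0, dtype=float)
        beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
        for _ in range(n_iters):
            y = x + beta * (x - x_prev)              # extrapolation (momentum) step
            x_prev = x
            x = prox_g(y - grad_f(y) / L, 1.0 / L)   # forward (gradient) then backward (proximal) step
        return x

    # Illustrative instance (elastic net): f(x) = 0.5*||A x - b||^2 + 0.5*mu*||x||^2,
    # g(x) = lam*||x||_1, whose proximal operator is soft-thresholding.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((40, 20)), rng.standard_normal(40)
    mu, lam = 0.1, 0.05
    L = np.linalg.norm(A, 2) ** 2 + mu               # Lipschitz constant of grad f
    grad_f = lambda x: A.T @ (A @ x - b) + mu * x
    prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)
    x_hat = accelerated_proximal_gradient(grad_f, prox_g, np.zeros(20), L, mu)

The constant momentum above yields the usual linear convergence rate for strongly convex composite problems; the methods of the note additionally tolerate relative and absolute errors in the proximal step, which this sketch does not model.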