Recently, several authors have shown local and global convergence rate
results for Douglas-Rachford splitting under strong monotonicity, Lipschitz
continuity, and cocoercivity assumptions. Most of these focus on the convex
optimization setting. In the more general monotone inclusion setting, Lions and
Mercier showed a linear convergence rate bound under the assumption that one of
the two operators is strongly monotone and Lipschitz continuous. We show that
this bound is not tight, in the sense that no problem in the considered class
converges exactly at that rate. In this paper, we present tight global linear
convergence rate bounds for that class of problems. We also provide tight
linear convergence rate bounds under the assumption that one of the operators
is both strongly monotone and cocoercive, and under the assumption that one of
the operators is strongly monotone and the other is cocoercive. All our linear
convergence results are
obtained by proving the stronger property that the Douglas-Rachford operator is
contractive.
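For concreteness, the following is a minimal sketch of the operator in
question under a standard parametrization; the symbols $A$, $B$, $\gamma$, and
the $1/2$-averaging are assumptions made here for illustration and may differ
from the parametrization used in the body of the paper. For the monotone
inclusion $0 \in Ax + Bx$, with resolvent $J_{\gamma A} = (I + \gamma A)^{-1}$
and reflected resolvent $R_{\gamma A} = 2 J_{\gamma A} - I$, the
Douglas-Rachford operator reads
% Douglas-Rachford operator (standard 1/2-averaged form, assumed here)
\[
  T_{\mathrm{DR}} \;=\; \tfrac{1}{2}\bigl(I + R_{\gamma B} R_{\gamma A}\bigr).
\]
Contractivity means that there exists a factor $r \in [0,1)$ such that
% Contraction property; linear convergence of z^{k+1} = T_DR z^k follows at rate r
\[
  \|T_{\mathrm{DR}} x - T_{\mathrm{DR}} y\| \;\le\; r\,\|x - y\|
  \quad \text{for all } x, y,
\]
which immediately yields linear convergence of the fixed-point iteration
$z^{k+1} = T_{\mathrm{DR}} z^k$ at rate $r$.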