Stochastic Primal-Dual Hybrid Gradient (SPDHG) is an algorithm for efficiently
solving a wide class of nonsmooth, large-scale optimization problems. In this
paper we contribute to its theoretical foundations and prove its almost sure
convergence for convex, but not necessarily strongly convex or smooth,
functionals. We also prove its convergence for any sampling. In addition, we
study SPDHG for parallel Magnetic Resonance Imaging reconstruction, where data
from different coils are randomly selected at each iteration. We apply SPDHG
with a wide range of random sampling methods and compare its performance across
various settings, including mini-batch size and step-size parameters.
We show that the choice of sampling can significantly affect the convergence
speed of SPDHG and that in many cases an optimal sampling can be identified.
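
As a concrete illustration of the setting described above, the following is a
minimal Python sketch of the SPDHG iteration with serial (single-index)
sampling on a toy problem standing in for multi-coil data. The linear
operators A_i, the l1 regularizer, the uniform sampling probabilities p_i, and
the step-size choices are all illustrative assumptions, not the paper's
experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for multi-coil data: minimize
#   sum_i 0.5*||A_i x - b_i||^2 + lam*||x||_1
# (A_i, b_i and the l1 regularizer are illustrative assumptions)
n, m, n_coils = 64, 32, 8
A = [rng.standard_normal((m, n)) / np.sqrt(m) for _ in range(n_coils)]
x_true = np.where(rng.random(n) < 0.2, rng.standard_normal(n), 0.0)
b = [Ai @ x_true + 0.01 * rng.standard_normal(m) for Ai in A]
lam = 0.01

p = np.full(n_coils, 1.0 / n_coils)             # uniform serial sampling
L = [np.linalg.norm(Ai, 2) for Ai in A]         # operator norms ||A_i||
gamma = 0.99                                    # safety factor < 1
sigma = [gamma / L[i] for i in range(n_coils)]  # dual step sizes
tau = gamma * min(p[i] / L[i] for i in range(n_coils))  # primal step size

def prox_l1(v, t):
    """Prox of t*lam*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

x = np.zeros(n)
y = [np.zeros(m) for _ in range(n_coils)]
z = np.zeros(n)     # z = sum_i A_i^T y_i
zbar = z.copy()     # extrapolated dual aggregate

for k in range(2000):
    # primal step against the extrapolated aggregate
    x = prox_l1(x - tau * zbar, tau)
    # sample one block; any sampling with known p_i fits this scheme
    i = rng.choice(n_coils, p=p)
    # dual step: prox of sigma_i * f_i^* with f_i = 0.5*||. - b_i||^2,
    # which has the closed form (v - sigma*b_i)/(1 + sigma)
    y_new = (y[i] + sigma[i] * (A[i] @ x - b[i])) / (1.0 + sigma[i])
    dy = y_new - y[i]
    y[i] = y_new
    dz = A[i].T @ dy
    z = z + dz
    zbar = z + dz / p[i]  # 1/p_i extrapolation corrects the sampling bias
```

In this sketch the 1/p_i extrapolation keeps the dual aggregate unbiased under
random sampling; moving to a nonuniform sampling changes only p, sigma, and
tau above, which is the knob the sampling comparison in the abstract turns.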