In this paper, we propose a new primal-dual algorithmic framework for a class
of convex-concave saddle point problems frequently arising from image
processing and machine learning. Our algorithmic framework updates the primal
variable between two successive updates of the dual variable, thereby yielding
a symmetric iterative scheme, which is accordingly called the {\bf s}ymmetric
{\bf p}r{\bf i}mal-{\bf d}ual {\bf a}lgorithm (SPIDA). It is noteworthy that
the subproblems of our SPIDA are equipped with Bregman proximal regularization
terms, which make SPIDA versatile in the sense that its framework covers
some existing algorithms such as the classical augmented
Lagrangian method (ALM), linearized ALM, and Jacobian splitting algorithms for
linearly constrained optimization problems. Besides, our algorithmic framework
allows us to derive some customized versions so that SPIDA works as efficiently
as possible for structured optimization problems. Theoretically, under some
mild conditions, we prove the global convergence of SPIDA and estimate the
linear convergence rate under a generalized error bound condition defined in
terms of the Bregman distance. Finally, a series of numerical experiments on the matrix
game, basis pursuit, robust principal component analysis, and image restoration
demonstrate that our SPIDA works well on synthetic and real-world datasets.

Comment: 32 pages; 5 figures; 7 tables
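To illustrate the symmetric update pattern described above (one primal update sandwiched between two dual updates), consider a prototypical saddle-point model $\min_{x}\max_{y}\, f(x)+\langle Kx,y\rangle-g(y)$. A generic symmetric primal-dual iteration with plain quadratic proximal terms can be sketched as follows; the operators $K$, step sizes $\tau,\sigma$, and quadratic regularization here are illustrative placeholders only, whereas SPIDA's actual subproblems employ Bregman proximal regularization terms:
\[
\begin{aligned}
y^{k+1/2} &= \operatorname{prox}_{\sigma g}\bigl(y^{k} + \sigma K x^{k}\bigr),\\
x^{k+1}   &= \operatorname{prox}_{\tau f}\bigl(x^{k} - \tau K^{\top} y^{k+1/2}\bigr),\\
y^{k+1}   &= \operatorname{prox}_{\sigma g}\bigl(y^{k+1/2} + \sigma K x^{k+1}\bigr).
\end{aligned}
\]
Replacing the quadratic terms $\tfrac{1}{2\sigma}\|y-y^{k}\|^2$ and $\tfrac{1}{2\tau}\|x-x^{k}\|^2$ implicit in the proximal steps by Bregman distances is what allows the framework to recover variants such as the linearized ALM.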