In this paper, we consider a class of nonconvex (not necessarily
differentiable) optimization problems called generalized DC
(Difference-of-Convex functions) programming, which minimizes the sum of
two separable DC parts and a two-block-variable coupling function. To
circumvent the nonconvexity and nonseparability of the problem under
consideration, we accordingly introduce a Unified Bregman Alternating
Minimization Algorithm (UBAMA) by maximally exploiting the favorable DC
structure of the objective. Specifically, we first follow the spirit of
alternating minimization to update each block variable in a sequential order,
which can efficiently tackle the nonseparability caused by the coupling
function. Then, we employ the Fenchel-Young inequality to approximate the
second DC components (i.e., concave parts) so that each subproblem reduces to a
convex optimization problem, thereby alleviating the computational burden of
the nonconvex DC parts. Moreover, each subproblem incorporates a Bregman
proximal regularization term, which often yields closed-form subproblem
solutions when the Bregman kernel functions are chosen appropriately.
Notably, our algorithm not only provides an algorithmic
framework for understanding the iterative schemes of several recently
proposed algorithms, but also admits implementable schemes with easier subproblems than
some state-of-the-art first-order algorithms developed for generic nonconvex
and nonsmooth optimization problems. Theoretically, we prove that the sequence
generated by our algorithm globally converges to a critical point under the
Kurdyka-{\L}ojasiewicz (K{\L}) condition. Moreover, we estimate the local
convergence rates of our algorithm when prior information on the K{\L}
exponent is available.

Comment: 44 pages, 7 figures, 5 tables. Any comments are welcome.
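For concreteness, the alternating scheme described above can be sketched as follows. This is a minimal sketch under assumed notation, not the paper's exact formulation: we write the objective as $f_1(x) - f_2(x) + g_1(y) - g_2(y) + H(x,y)$, where $f_1 - f_2$ and $g_1 - g_2$ are the two separable DC parts, $H$ is the coupling function, and $D_{\phi}, D_{\psi}$ are Bregman distances generated by kernel functions $\phi, \psi$:

```latex
\begin{align*}
  % Fenchel-Young step: pick subgradients of the concave parts
  \xi^k &\in \partial f_2(x^k), \qquad \eta^k \in \partial g_2(y^k), \\
  % x-update: convex subproblem with Bregman proximal regularization
  x^{k+1} &\in \operatorname*{arg\,min}_{x}
    \Big\{ f_1(x) + H(x, y^k) - \langle x, \xi^k \rangle
           + D_{\phi}(x, x^k) \Big\}, \\
  % y-update: uses the freshly computed x^{k+1} (sequential order)
  y^{k+1} &\in \operatorname*{arg\,min}_{y}
    \Big\{ g_1(y) + H(x^{k+1}, y) - \langle y, \eta^k \rangle
           + D_{\psi}(y, y^k) \Big\}.
\end{align*}
```

By the Fenchel-Young inequality, $f_2(x) \geq \langle x, \xi^k \rangle - f_2^*(\xi^k)$ for any $\xi^k \in \partial f_2(x^k)$, so replacing $-f_2(x)$ with the affine majorizer $-\langle x, \xi^k \rangle$ (up to a constant) makes each subproblem convex, which is the mechanism the abstract refers to for alleviating the nonconvex DC parts.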