Classical primal-dual algorithms attempt to solve max_μ min_x L(x, μ) by alternately minimizing over the primal variable x
through primal descent and maximizing over the dual variable μ through dual
ascent. However, when L(x, μ) is highly nonconvex with complex
constraints in x, the minimization over x may not attain global
optimality, and hence the dual ascent step loses its theoretical justification. This
observation motivates us to propose a new class of primal-dual algorithms for
nonconvex constrained optimization with the key feature to reverse dual ascent
to a conceptually new dual descent, in a sense, elevating the dual variable to
the same status as the primal variable. Surprisingly, this new dual scheme
achieves some of the best-known iteration complexities for solving nonconvex optimization
problems. In particular, when the dual descent step is scaled by a fractional
constant, we call the scheme scaled dual descent (SDD); otherwise, unscaled dual
descent (UDD). For nonconvex multiblock optimization with nonlinear equality
constraints, we propose SDD-ADMM and show that it finds an
ϵ-stationary solution in O(ϵ^{-4}) iterations. The
complexity is further improved to O(ϵ^{-3}) and
O(ϵ^{-2}) under proper conditions. We also propose UDD-ALM,
combining UDD with ALM, for weakly convex minimization over affine constraints.
We show that UDD-ALM finds an ϵ-stationary solution in
O(ϵ^{-2}) iterations. These complexity bounds for both
algorithms either match or improve upon the best-known results in the ADMM and ALM
literature. Moreover, SDD-ADMM addresses a long-standing limitation of existing
ADMM frameworks
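For context, the classical primal descent / dual ascent scheme that the abstract contrasts against can be sketched on a toy convex problem. The quadratic objective, step size, and iteration count below are illustrative assumptions, not from the paper, and the paper's SDD/UDD reversal of the dual step is deliberately not reproduced:

```python
# Toy sketch (not the paper's algorithm): classical primal descent /
# dual ascent on L(x, mu) = x^2 + mu * (x - 1), i.e. min x^2 s.t. x = 1.
# Stationarity requires 2x + mu = 0 and x = 1, so (x*, mu*) = (1, -2).
# SDD/UDD would instead *descend* in mu (optionally with a fractional
# scaling constant); that reversal is the paper's contribution.

def primal_descent_dual_ascent(eta=0.1, iters=2000):
    x, mu = 0.0, 0.0
    for _ in range(iters):
        grad_x = 2 * x + mu        # dL/dx
        grad_mu = x - 1            # dL/dmu (the constraint residual)
        x = x - eta * grad_x       # primal descent step
        mu = mu + eta * grad_mu    # classical dual ascent step
    return x, mu
```

On this convex toy problem the iterates converge to the saddle point (1, -2); the abstract's point is that when the inner minimization over x cannot be solved globally, this ascent direction for μ is no longer justified.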