On the convergence analysis of DCA

Abstract

In this paper, we propose a clean and general proof framework to establish the convergence analysis of the Difference-of-Convex (DC) programming algorithm (DCA) for both standard DC programs and convex constrained DC programs. We first discuss suitable assumptions for the well-definedness of DCA. Then, we focus on the convergence analysis of DCA, in particular, the global convergence of the sequence $\{x^k\}$ generated by DCA under the Łojasiewicz subgradient inequality and the Kurdyka–Łojasiewicz property, respectively. Moreover, the convergence rates of the sequences $\{f(x^k)\}$ and $\{\|x^k - x^*\|\}$ are also investigated. We hope that the proof framework presented in this article will be a useful tool to conveniently establish the convergence analysis for many variants of DCA and new DCA-type algorithms.
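For context, the DCA scheme referred to in the abstract minimizes $f = g - h$ (with $g$, $h$ convex) by linearizing $h$ at the current iterate and solving the resulting convex subproblem. The sketch below is illustrative only and not taken from the paper; the function names (`solve_linearized`, `subgrad_h`) and the toy DC decomposition are assumptions made for the example.

```python
def dca(solve_linearized, subgrad_h, x0, iters=50):
    """Minimal DCA sketch for minimizing f = g - h (g, h convex).

    At each step, pick y^k in the subdifferential of h at x^k and set
        x^{k+1} = argmin_x { g(x) - <y^k, x> },
    which `solve_linearized(y)` is assumed to return.
    """
    x = x0
    for _ in range(iters):
        y = subgrad_h(x)         # y^k in ∂h(x^k)
        x = solve_linearized(y)  # convex subproblem gives x^{k+1}
    return x

# Toy DC program (illustrative): f(x) = x**2/2 - |x|,
# with g(x) = x**2/2 and h(x) = |x|.
# The subproblem argmin_x g(x) - y*x is solved by x = y,
# and sign(x) is a subgradient of |x|.
x_star = dca(solve_linearized=lambda y: y,
             subgrad_h=lambda x: 1.0 if x >= 0 else -1.0,
             x0=0.3)
# x_star = 1.0, a global minimizer of this toy f
```

Starting from any positive `x0`, the iterates reach the minimizer $x^* = 1$ in one step; the paper's contribution concerns when and how fast such sequences $\{x^k\}$ converge in general.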
