On the convergence analysis of DCA
arXiv:2211.10942

Abstract
In this paper, we propose a clean and general proof framework to establish the convergence analysis of the Difference-of-Convex (DC) programming algorithm (DCA) for both the standard DC program and the convex constrained DC program. We first discuss suitable assumptions for the well-definedness of DCA. Then, we focus on the convergence analysis of DCA, in particular, the global convergence of the sequence $\{x^k\}$ generated by DCA under the Łojasiewicz subgradient inequality and the Kurdyka-Łojasiewicz property, respectively. Moreover, the convergence rates of the sequences $\{f(x^k)\}$ and $\{\|x^k - x^*\|\}$ are also investigated. We hope that the proof framework presented in this article will be a useful tool for conveniently establishing the convergence analysis of many variants of DCA and new DCA-type algorithms.
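As a concrete illustration (not taken from the paper), the DCA iteration for a standard DC program $f = g - h$, with $g$ and $h$ convex, linearizes $h$ at the current iterate and minimizes the resulting convex subproblem: $x^{k+1} \in \arg\min_x\, g(x) - \langle \nabla h(x^k), x \rangle$. A minimal one-dimensional sketch in Python, assuming the hypothetical toy problem $g(x) = x^4$, $h(x) = x^2$, where the subproblem has a closed-form solution:

```python
# Toy DCA illustration (hypothetical example, not the paper's code).
# Minimize f(x) = g(x) - h(x) with g(x) = x**4 (convex), h(x) = x**2 (convex).
# DCA step: linearize h at x_k, then solve the convex subproblem
#   x_{k+1} = argmin_x  x**4 - h'(x_k) * x,
# whose first-order condition 4*x**3 = h'(x_k) gives the closed form below.

def dca_step(x):
    grad_h = 2.0 * x          # h'(x) = 2x
    c = grad_h / 4.0
    # real cube root, handling negative arguments
    return abs(c) ** (1.0 / 3.0) * (1.0 if c >= 0 else -1.0)

def dca(x0, iters=50):
    """Run the DCA iteration from x0 for a fixed number of steps."""
    x = x0
    for _ in range(iters):
        x = dca_step(x)
    return x

# Stationary points of f satisfy 4*x**3 = 2*x, i.e. x = ±1/sqrt(2);
# starting from x0 = 1.0 the iterates converge to 1/sqrt(2).
x_star = dca(1.0)
```

In this example the sequence $\{f(x^k)\}$ is nonincreasing, the basic descent property that the paper's framework builds on, and the iterates converge linearly to a stationary point of $f$.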