Incremental Methods for Weakly Convex Optimization (1907.11687v2)
Abstract: Incremental methods are widely used for solving finite-sum optimization problems in machine learning and signal processing. In this paper, we study a family of incremental methods -- including the incremental subgradient, incremental proximal point, and incremental prox-linear methods -- for solving weakly convex optimization problems. Such a problem class covers many nonsmooth nonconvex instances that arise in engineering fields. We show that these three incremental methods have an iteration complexity of $O(\varepsilon^{-4})$ for driving a natural stationarity measure below $\varepsilon$. Moreover, we show that if the weakly convex function satisfies a sharpness condition, then all three incremental methods, when properly initialized and equipped with geometrically diminishing stepsizes, achieve a local linear rate of convergence. Our work is the first to extend the convergence rate analysis of incremental methods from the nonsmooth convex regime to the weakly convex regime. Lastly, we conduct numerical experiments on the robust matrix sensing problem to illustrate the convergence performance of the three incremental methods.
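To make the setting concrete, the following is a minimal Python sketch of the incremental subgradient method with geometrically diminishing stepsizes for a generic finite-sum objective $f(x) = \sum_i f_i(x)$. The function names, stepsize parameters, and the toy $\ell_1$-regression example are illustrative assumptions, not the paper's experimental setup (the paper's experiments use robust matrix sensing).

```python
import numpy as np

def incremental_subgradient(subgrads, x0, alpha0=0.1, rho=0.98, num_cycles=200):
    """Incremental subgradient sketch with geometrically diminishing stepsizes.

    subgrads   : list of callables; subgrads[i](x) returns a subgradient of f_i at x
    x0         : initial point (NumPy array); the sharpness-based linear-rate result
                 additionally assumes a suitably good initialization
    alpha0, rho: stepsize schedule alpha_k = alpha0 * rho**k, one stepsize per cycle
    num_cycles : number of incremental passes over the m components
    """
    x = x0.copy()
    m = len(subgrads)
    for k in range(num_cycles):
        alpha = alpha0 * rho ** k      # geometrically diminishing stepsize
        for i in range(m):             # process components one at a time (incrementally)
            x = x - alpha * subgrads[i](x)
    return x

# Hypothetical toy illustration: robust l1 regression, f(x) = sum_i |a_i^T x - b_i|,
# where each component f_i(x) = |a_i^T x - b_i| is convex, hence weakly convex.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
subgrads = [lambda x, a=A[i], bi=b[i]: np.sign(a @ x - bi) * a for i in range(50)]
x_hat = incremental_subgradient(subgrads, x0=np.zeros(10))
print("distance to x_true:", np.linalg.norm(x_hat - x_true))
```

The incremental proximal point and prox-linear methods follow the same outer loop, but replace the subgradient step on $f_i$ with a proximal step on $f_i$ or on a partial linearization of $f_i$, respectively.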