Parallel and Distributed Successive Convex Approximation Methods for Big-Data Optimization (1805.06963v1)
Abstract: Recent years have witnessed a surge of interest in parallel and distributed optimization methods for large-scale systems. In particular, nonconvex large-scale optimization problems have found a wide range of applications in several engineering fields. The design and analysis of such complex, large-scale systems pose several challenges and call for the development of new optimization models and algorithms. The major contribution of this paper is to put forth a general, unified algorithmic framework, based on Successive Convex Approximation (SCA) techniques, for the parallel and distributed solution of a general class of nonconvex constrained (nonseparable, networked) problems. The presented framework unifies and generalizes several existing SCA methods, making them appealing for parallel/distributed implementation while offering a flexible choice of surrogate functions, step-size schedules, and control of the computation/communication trade-off. This paper is organized according to the lectures that one of the authors delivered at the CIME Summer School on Centralized and Distributed Multi-agent Optimization Models and Algorithms, held in Cetraro, Italy, June 23--27, 2014. These lectures are: I) Successive Convex Approximation Methods: Basics; II) Parallel Successive Convex Approximation Methods; and III) Distributed Successive Convex Approximation Methods.
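To make the SCA idea referenced in the abstract concrete, the following is a minimal sketch of a generic successive convex approximation loop, not the paper's framework: at each iteration a strongly convex surrogate of the nonconvex objective (here, a simple linearization plus a proximal term) is minimized over the feasible set, and the iterate is updated with a diminishing step size. The toy objective, box constraint, proximal weight `tau`, and step-size rule are all assumptions chosen for illustration.

```python
import numpy as np

def f(x, c):
    """Toy smooth nonconvex objective: sum_i (x_i^2 - 1)^2 + c^T x (illustrative)."""
    return np.sum((x**2 - 1.0)**2) + c @ x

def grad_f(x, c):
    """Gradient of the toy objective."""
    return 4.0 * x * (x**2 - 1.0) + c

def sca(x0, c, lo=-2.0, hi=2.0, tau=4.0, iters=200):
    """Linearized SCA sketch (an assumption, not the paper's algorithm):
    surrogate at x^k is grad_f(x^k)^T (x - x^k) + (tau/2) ||x - x^k||^2,
    minimized over the box [lo, hi]^n, which has a closed-form solution."""
    x = np.clip(x0, lo, hi)
    for k in range(iters):
        # Best response: minimizer of the strongly convex surrogate over the box.
        x_hat = np.clip(x - grad_f(x, c) / tau, lo, hi)
        gamma = 1.0 / (k + 2)          # diminishing step-size rule
        x = x + gamma * (x_hat - x)    # convex combination preserves feasibility
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 10
    c = 0.1 * rng.standard_normal(n)
    x_final = sca(rng.standard_normal(n), c)
    print("final objective:", f(x_final, c))
```

In a parallel or distributed setting, the surrogate would typically decompose across blocks or agents so that each best-response step can be computed locally; the sketch above only shows the centralized template.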