
Randomized Coordinate Subgradient Method for Nonsmooth Composite Optimization (2206.14981v3)

Published 30 Jun 2022 in math.OC and cs.LG

Abstract: Coordinate-type subgradient methods for addressing nonsmooth optimization problems are relatively underexplored due to the set-valued nature of the subdifferential. In this work, we focus on nonsmooth composite optimization problems, which encompass a wide class of convex and weakly convex (nonconvex nonsmooth) problems. By properly utilizing the chain rule of the composite structure, we introduce the Randomized Coordinate Subgradient method (RCS) for tackling this problem class. To the best of our knowledge, this is the first coordinate subgradient method for solving general nonsmooth composite optimization problems. In theory, we assume linearly bounded subgradients for the objective function, which is more general than the traditional Lipschitz continuity assumption and accounts for practical scenarios. We then conduct convergence analysis for RCS in both the convex and weakly convex cases under this generalized Lipschitz-type assumption. Specifically, when $f$ is convex, we establish the $\widetilde{\mathcal{O}}(1/\sqrt{k})$ convergence rate in expectation and the $\tilde{o}(1/\sqrt{k})$ almost sure asymptotic convergence rate in terms of the suboptimality gap. For the case when $f$ is weakly convex and its subdifferential satisfies the global metric subregularity property, we derive the $\mathcal{O}(\varepsilon^{-4})$ iteration complexity in expectation, together with an asymptotic convergence result. To justify the global metric subregularity property utilized in the analysis, we establish this error bound condition for the concrete (real-valued) robust phase retrieval problem. We also provide a convergence lemma and the relationship between the global metric subregularity properties of a weakly convex function and its Moreau envelope. Finally, we conduct several experiments to demonstrate the possible superiority of RCS over the standard subgradient method.
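To make the idea concrete, below is a minimal sketch of a randomized coordinate subgradient step applied to the robust phase retrieval objective $f(x) = \frac{1}{m}\sum_i |\langle a_i, x\rangle^2 - b_i|$, the weakly convex example mentioned in the abstract. A chain-rule subgradient of this composite objective is $g = \frac{2}{m} A^\top (v \odot r)$ with $r = Ax$ and $v_i \in \operatorname{sign}(r_i^2 - b_i)$; each iteration updates one uniformly sampled coordinate along $g_j$. The function name, the diminishing step-size schedule `gamma0 / sqrt(k+1)`, and the sampling scheme are illustrative assumptions for this sketch, not the paper's exact algorithm or parameter choices.

```python
import numpy as np

def rcs_phase_retrieval(A, b, x0, steps=3000, gamma0=0.05, seed=0):
    """Illustrative randomized coordinate subgradient sketch for
    f(x) = (1/m) * sum_i | <a_i, x>^2 - b_i |.
    Hypothetical step-size schedule gamma0 / sqrt(k+1); not the
    paper's exact parameterization."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = x0.astype(float).copy()
    r = A @ x                      # maintain inner products r_i = <a_i, x>
    for k in range(steps):
        v = np.sign(r**2 - b)      # v_i in the subdifferential of |.|
        j = rng.integers(n)        # sample one coordinate uniformly
        # coordinate j of the chain-rule subgradient g = (2/m) A^T (v * r)
        gj = (2.0 / m) * np.dot(v * r, A[:, j])
        delta = -(gamma0 / np.sqrt(k + 1)) * gj
        x[j] += delta
        r += delta * A[:, j]       # incremental update keeps each step O(m)
    return x
```

Maintaining `r` incrementally is what makes a coordinate step cheap: each iteration touches only one column of `A`, costing $O(m)$ rather than the $O(mn)$ of a full subgradient step.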
