
Randomized Block-Coordinate Optimistic Gradient Algorithms for Root-Finding Problems

(2301.03113)
Published Jan 8, 2023 in math.OC and stat.ML

Abstract

In this paper, we develop two new randomized block-coordinate optimistic gradient algorithms to approximate a solution of nonlinear equations in large-scale settings, which are called root-finding problems. Our first algorithm is non-accelerated with constant stepsizes, and achieves an $\mathcal{O}(1/k)$ best-iterate convergence rate on $\mathbb{E}[\Vert Gx^k\Vert^2]$ when the underlying operator $G$ is Lipschitz continuous and satisfies a weak Minty solution condition, where $\mathbb{E}[\cdot]$ denotes the expectation and $k$ is the iteration counter. Our second method is a new accelerated randomized block-coordinate optimistic gradient algorithm. We establish both $\mathcal{O}(1/k^2)$ and $o(1/k^2)$ last-iterate convergence rates on both $\mathbb{E}[\Vert Gx^k\Vert^2]$ and $\mathbb{E}[\Vert x^{k+1} - x^k\Vert^2]$ for this algorithm under the co-coerciveness of $G$. In addition, we prove that the iterate sequence $\{x^k\}$ converges to a solution almost surely, and that $\Vert Gx^k\Vert^2$ attains an $o(1/k)$ almost-sure convergence rate. We then apply our methods to a class of large-scale finite-sum inclusions, which covers prominent applications in machine learning, statistical learning, and network optimization, especially in federated learning. We obtain two new federated-learning-type algorithms and corresponding convergence rate guarantees for solving this problem class.
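The page reproduces only the abstract, so the paper's exact update rules are not shown here. As a rough illustration of the non-accelerated scheme's flavor, the sketch below applies the classical optimistic (past extra-gradient) update $x^{k+1} = x^k - \eta(2Gx^k - Gx^{k-1})$ to a single, uniformly sampled coordinate block per iteration. The function name `block_coordinate_og`, the stepsize, and the uniform block sampling are illustrative assumptions and may differ from the authors' actual method and stepsize conditions.

```python
import numpy as np

def block_coordinate_og(G, x0, n_blocks, eta=0.1, iters=1000, seed=0):
    """Sketch of a randomized block-coordinate optimistic gradient method
    for the root-finding problem G(x) = 0 (illustrative, not the paper's
    exact algorithm).

    Each iteration samples one coordinate block uniformly at random and
    applies the classical optimistic update
        x_i^{k+1} = x_i^k - eta * (2 * G_i(x^k) - G_i(x^{k-1}))
    on that block only, leaving all other blocks unchanged.
    """
    rng = np.random.default_rng(seed)
    x_prev = x0.copy()
    x = x0.copy()
    blocks = np.array_split(np.arange(x0.size), n_blocks)
    g_prev = G(x_prev)  # operator value at the previous iterate (x^{-1} = x^0)
    for _ in range(iters):
        # For simplicity we evaluate the full operator; a practical
        # block-coordinate implementation would evaluate only G_i(x).
        g = G(x)
        idx = blocks[rng.integers(n_blocks)]  # uniformly sampled block
        x_new = x.copy()
        x_new[idx] = x[idx] - eta * (2.0 * g[idx] - g_prev[idx])
        x_prev, x = x, x_new
        g_prev = g
    return x

if __name__ == "__main__":
    # Example: find the root of the co-coercive affine operator G(x) = Ax - b.
    A = np.diag([1.0, 2.0, 3.0, 4.0])
    b = np.ones(4)
    x = block_coordinate_og(lambda z: A @ z - b, np.zeros(4), n_blocks=2,
                            eta=0.1, iters=5000)
    print(np.linalg.norm(A @ x - b))  # residual ||Gx|| should be near zero
```

The example operator $G(x) = Ax - b$ with $A$ symmetric positive definite is co-coercive, matching the setting of the accelerated method's guarantees; per-block updates are what make such schemes attractive when evaluating the full operator at every iteration is too expensive.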
