
Convergence of Extragradient SVRG for Variational Inequalities: Error Bounds and Increasing Iterate Averaging (2306.01796v2)

Published 1 Jun 2023 in math.OC and cs.GT

Abstract: We study the last-iterate convergence of variance reduction methods for extragradient (EG) algorithms for a class of variational inequalities satisfying error-bound conditions. Previously, last-iterate linear convergence was only known under strong monotonicity. We show that EG algorithms with SVRG-style variance reduction, denoted SVRG-EG, attain last-iterate linear convergence under a general error-bound condition much weaker than strong monotonicity. This condition captures a broad class of non-strongly monotone problems, such as the bilinear saddle-point problems commonly encountered in two-player zero-sum Nash equilibrium computation. Next, we establish last-iterate linear convergence of SVRG-EG with an improved guarantee under the weak-sharpness assumption. Furthermore, motivated by the empirical efficiency of increasing iterate averaging techniques in solving saddle-point problems, we also establish new convergence results for SVRG-EG with such techniques.
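To make the ingredients concrete, the sketch below combines the standard extragradient update with an SVRG-style variance-reduced operator estimate and linearly increasing iterate averaging, for a finite-sum operator F(z) = mean_i F_i(z). This is a minimal illustration under assumed conventions, not the paper's exact algorithm: the function names (`svrg_eg`, `F_parts`, `make_Fi`), the epoch/snapshot schedule, and the step size `eta` are all illustrative choices.

```python
import numpy as np

def svrg_eg(F_parts, z0, eta=0.1, epochs=50, inner_steps=None, seed=0):
    """Sketch of SVRG-style extragradient for F(z) = mean_i F_i(z).

    Returns the last iterate and a linearly (increasingly) weighted
    average of the iterates. Illustrative only; hyperparameters are
    assumptions, not the paper's tuning.
    """
    rng = np.random.default_rng(seed)
    n = len(F_parts)
    m = inner_steps if inner_steps is not None else n
    z = np.asarray(z0, dtype=float)
    z_bar = z.copy()
    t = 0
    for _ in range(epochs):
        w = z.copy()                                      # snapshot point
        F_w = np.mean([Fi(w) for Fi in F_parts], axis=0)  # full operator at snapshot
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced estimate of F at the current point.
            g = F_parts[i](z) - F_parts[i](w) + F_w
            z_half = z - eta * g                          # extrapolation step
            g_half = F_parts[i](z_half) - F_parts[i](w) + F_w
            z = z - eta * g_half                          # extragradient update
            # Increasing (linear-weight) iterate averaging:
            # z_bar_t = sum_s s*z_s / sum_s s, updated incrementally.
            t += 1
            z_bar += (2.0 / (t + 1)) * (z - z_bar)
    return z, z_bar

if __name__ == "__main__":
    # Toy bilinear saddle point min_x max_y x^T A y with A = mean_i A_i;
    # the VI operator F(x, y) = (A y, -A^T x) splits across the A_i.
    # The step size must be small relative to the operator's Lipschitz constant.
    rng = np.random.default_rng(1)
    d, n = 5, 10
    A_parts = [rng.standard_normal((d, d)) for _ in range(n)]

    def make_Fi(Ai):
        def Fi(z):
            x, y = z[:d], z[d:]
            return np.concatenate([Ai @ y, -Ai.T @ x])
        return Fi

    z_last, z_avg = svrg_eg([make_Fi(Ai) for Ai in A_parts],
                            np.ones(2 * d), eta=0.05)
```

The snapshot structure is what makes the estimate variance-reduced: near a solution, F_i(z) - F_i(w) shrinks, so the noise in g vanishes and last-iterate (rather than only averaged-iterate) convergence becomes plausible, which is the regime the paper analyzes under error-bound conditions.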

Authors (3)
  1. Tianlong Nan (4 papers)
  2. Yuan Gao (336 papers)
  3. Christian Kroer (83 papers)
Citations (1)
