Geometry-Aware Universal Mirror-Prox

(2011.11203)
Published Nov 23, 2020 in cs.LG

Abstract

Mirror-prox (MP) is a well-known algorithm for solving variational inequality (VI) problems. VIs with a monotone operator cover a large class of settings, including convex minimization and min-max (saddle-point) problems. To obtain a convergent algorithm, the step-size of the classic MP algorithm relies heavily on problem-dependent knowledge of the operator, such as its smoothness parameter, which is hard to estimate. Recently, a universal variant of MP for smooth/bounded operators was introduced whose step-size depends only on the norms of the MP updates. In this work, we relax this dependence from the norm of the updates to the Bregman divergence between updates. This relaxation allows us to extend the analysis of universal MP to settings where the operator is not smooth or bounded. Furthermore, we analyse the VI problem with a stochastic monotone operator in several settings and obtain rates that are optimal up to a logarithmic factor.
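To make the step-size dependence concrete, below is a minimal sketch of mirror-prox with an adaptive step-size in the Euclidean setup, where the Bregman divergence reduces to the squared Euclidean distance. The operator F, the projection oracle, and the accumulation rule driving the step-size are illustrative assumptions, not the paper's exact geometry-aware scheme.

```python
import numpy as np

def universal_mirror_prox(F, project, x0, T=1000, G0=1.0):
    """Euclidean mirror-prox (extragradient) with an adaptive step-size.

    Sketch only: the step-size rule based on accumulated Bregman divergences
    between consecutive updates is illustrative, not the paper's exact rule.
      F       -- monotone operator, maps a point to a vector of the same shape
      project -- projection onto the feasible set X
      x0      -- starting point, assumed to lie in X
      G0      -- positive constant seeding the accumulator (assumption)
    """
    x = x0.astype(float).copy()
    avg = np.zeros_like(x)
    acc = G0 ** 2                          # running sum of scaled divergences
    for _ in range(T):
        eta = 1.0 / np.sqrt(acc)           # step-size from past divergences only
        y = project(x - eta * F(x))        # extrapolation step
        x_next = project(x - eta * F(y))   # update step
        # Euclidean Bregman divergence between the two updates: 0.5 * ||y - x||^2
        acc += 0.5 * np.linalg.norm(y - x) ** 2 / eta ** 2
        x = x_next
        avg += y
    return avg / T                         # ergodic average, the standard VI output
```

For example, a bilinear saddle-point problem min_x max_y x^T A y corresponds to the monotone operator F(x, y) = (A y, -A^T x), so the same routine applies once F and the projection onto the feasible set are supplied.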
