
On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging (1307.1879v1)

Published 7 Jul 2013 in math.OC and cs.SY

Abstract: This paper considers stochastic subgradient mirror-descent method for solving constrained convex minimization problems. In particular, a stochastic subgradient mirror-descent method with weighted iterate-averaging is investigated and its per-iterate convergence rate is analyzed. The novel part of the approach is in the choice of weights that are used to construct the averages. Through the use of these weighted averages, we show that the known optimal rates can be obtained with simpler algorithms than those currently existing in the literature. Specifically, by suitably choosing the stepsize values, one can obtain the rate of the order $1/k$ for strongly convex functions, and the rate $1/\sqrt{k}$ for general convex functions (not necessarily differentiable). Furthermore, for the latter case, it is shown that a stochastic subgradient mirror-descent with iterate averaging converges (along a subsequence) to an optimal solution, almost surely, even with the stepsize of the form $1/\sqrt{1+k}$, which was not previously known. The stepsize choices that achieve the best rates are those proposed by Paul Tseng for acceleration of proximal gradient methods.

Citations (114)
