
O(logT) Projections for Stochastic Optimization of Smooth and Strongly Convex Functions (1304.0740v1)

Published 2 Apr 2013 in cs.LG

Abstract: Traditional algorithms for stochastic optimization require projecting the solution at each iteration into a given domain to ensure its feasibility. When facing complex domains, such as positive semi-definite cones, the projection operation can be expensive, leading to a high computational cost per iteration. In this paper, we present a novel algorithm that aims to reduce the number of projections for stochastic optimization. The proposed algorithm combines the strength of several recent developments in stochastic optimization, including mini-batch, extra-gradient, and epoch gradient descent, in order to effectively explore the smoothness and strong convexity. We show, both in expectation and with a high probability, that when the objective function is both smooth and strongly convex, the proposed algorithm achieves the optimal $O(1/T)$ rate of convergence with only $O(\log T)$ projections. Our empirical study verifies the theoretical result.
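The key idea in the abstract — paying for the expensive projection only once per epoch rather than once per iteration — can be illustrated with a simplified sketch. This is not the paper's exact algorithm (which also uses mini-batches and extra-gradient steps); it only shows how doubling epoch lengths turns a budget of $T$ iterations into $O(\log T)$ epochs, and hence $O(\log T)$ projections. The L2-ball projection stands in for a costly one such as projection onto a PSD cone.

```python
import numpy as np

def project_l2_ball(x, radius=1.0):
    # Euclidean projection onto an L2 ball -- a cheap stand-in for an
    # expensive projection (e.g., onto a positive semi-definite cone).
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def epoch_sgd_few_projections(grad_fn, x0, T, eta0=0.1):
    """Illustrative epoch-based SGD that projects only once per epoch.

    Epoch lengths double, so T total iterations are split into
    O(log T) epochs and the projection is invoked O(log T) times.
    A hedged sketch, not the paper's algorithm: the paper additionally
    employs mini-batches and extra-gradient updates within each epoch.
    """
    x = project_l2_ball(np.asarray(x0, dtype=float))
    epoch_len, eta, used, projections = 1, eta0, 0, 0
    while used < T:
        steps = min(epoch_len, T - used)
        y = x.copy()
        avg = np.zeros_like(x)
        for _ in range(steps):
            y = y - eta * grad_fn(y)      # unconstrained stochastic steps
            avg += y
        x = project_l2_ball(avg / steps)  # single projection per epoch
        projections += 1
        used += steps
        epoch_len *= 2                    # doubling => O(log T) epochs
        eta /= 2                          # shrink step size (strong convexity)
    return x, projections
```

For example, minimizing the strongly convex quadratic $f(x) = \tfrac{1}{2}\|x - b\|^2$ over a unit ball with `grad_fn = lambda x: x - b` and `T = 1000` uses only 10 projections, since the epoch lengths 1, 2, 4, ... exhaust the iteration budget after 10 doublings.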

Authors (4)
  1. Lijun Zhang (239 papers)
  2. Tianbao Yang (163 papers)
  3. Rong Jin (164 papers)
  4. Xiaofei He (70 papers)
Citations (29)