Linear Convergence of Stochastic Iterative Greedy Algorithms with Sparse Constraints (1407.0088v1)

Published 1 Jul 2014 in math.NA, cs.IT, math.IT, and math.OC

Abstract: Motivated by recent work on stochastic gradient descent methods, we develop two stochastic variants of greedy algorithms for possibly non-convex optimization problems with sparsity constraints. We prove linear convergence in expectation to the solution within a specified tolerance. This generalized framework applies to problems such as sparse signal recovery in compressed sensing, low-rank matrix recovery, and covariance matrix estimation, giving methods with provable convergence guarantees that often outperform their deterministic counterparts. We also analyze the settings where gradients and projections can only be computed approximately, and prove the methods are robust to these approximations. We include many numerical experiments which align with the theoretical analysis and demonstrate these improvements in several different settings.
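The abstract describes methods that pair a stochastic gradient step with a projection onto a sparsity constraint. As an illustration only, here is a minimal Python sketch of a stochastic iterative hard-thresholding style update for sparse least-squares recovery. The function names, step size, and row-sampling scheme are assumptions chosen for the sketch, not the paper's exact algorithms or tuned constants.

```python
import numpy as np

def hard_threshold(x, k):
    """Project onto the set of k-sparse vectors: keep the k
    largest-magnitude entries of x and zero out the rest."""
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out = np.zeros_like(x)
    out[idx] = x[idx]
    return out

def stochastic_iht(A, y, k, step=0.5, batch=25, n_iters=2000, seed=0):
    """Illustrative stochastic iterative hard thresholding for
    min ||A x - y||^2 subject to x being k-sparse. Each iteration
    samples a random block of rows of A to form an unbiased
    gradient estimate, takes a gradient step, and projects back
    onto the sparsity constraint. Step and batch sizes here are
    illustrative assumptions, not values from the paper."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iters):
        rows = rng.choice(m, size=batch, replace=False)
        A_i, y_i = A[rows], y[rows]
        # Scale by m / batch so the sampled gradient is unbiased
        # for the full-data gradient A.T @ (A @ x - y).
        grad = (m / batch) * (A_i.T @ (A_i @ x - y_i))
        x = hard_threshold(x - step * grad, k)
    return x

# Example: recover a 10-sparse signal from 100 Gaussian measurements.
rng = np.random.default_rng(1)
m, n, k = 100, 400, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = stochastic_iht(A, y, k)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Because each iteration touches only a small block of rows, the per-iteration cost is far below a full gradient computation, which is the source of the speedups over deterministic counterparts that the abstract mentions; the projection step is what distinguishes this family from plain stochastic gradient descent.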

Citations (96)
