Efficient and Practical Stochastic Subgradient Descent for Nuclear Norm Regularization (1206.6384v1)

Published 27 Jun 2012 in cs.LG and stat.ML

Abstract: We describe novel subgradient methods for a broad class of matrix optimization problems involving nuclear norm regularization. Unlike existing approaches, our method executes very cheap iterations by combining low-rank stochastic subgradients with efficient incremental SVD updates, made possible by highly optimized and parallelizable dense linear algebra operations on small matrices. Our practical algorithms always maintain a low-rank factorization of iterates that can be conveniently held in memory and efficiently multiplied to generate predictions in matrix completion settings. Empirical comparisons confirm that our approach is highly competitive with several recently proposed state-of-the-art solvers for such problems.
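To make the abstract's idea concrete, below is a minimal sketch (not the authors' exact algorithm) of a stochastic subgradient step for nuclear-norm-regularized matrix completion that keeps the iterate as a rank-r factorization X ≈ U diag(s) V^T and folds a rank-1 stochastic subgradient into it with a Brand-style incremental SVD update on small dense matrices. The names `rank_one_svd_update`, `stochastic_step`, and the parameters `rank`, `lam`, `eta` are illustrative assumptions, not from the paper.

```python
# Sketch only: stochastic subgradient descent on
#   f(X) = (X_ij - M_ij)^2 + lam * ||X||_*
# with X maintained as a low-rank factorization U diag(s) V^T.
import numpy as np

def rank_one_svd_update(U, s, V, a, b, rank):
    """Return a rank-`rank` SVD of U diag(s) V^T + a b^T (incremental update)."""
    m = U.T @ a
    p = a - U @ m
    p_norm = np.linalg.norm(p)
    P = p / p_norm if p_norm > 1e-12 else np.zeros_like(p)
    n = V.T @ b
    q = b - V @ n
    q_norm = np.linalg.norm(q)
    Q = q / q_norm if q_norm > 1e-12 else np.zeros_like(q)

    # Small (r+1) x (r+1) core matrix; its SVD is cheap dense linear algebra.
    K = np.zeros((len(s) + 1, len(s) + 1))
    K[:len(s), :len(s)] = np.diag(s)
    K += np.outer(np.append(m, p_norm), np.append(n, q_norm))
    Uk, sk, Vkt = np.linalg.svd(K)

    U_new = np.hstack([U, P[:, None]]) @ Uk
    V_new = np.hstack([V, Q[:, None]]) @ Vkt.T
    return U_new[:, :rank], sk[:rank], V_new[:, :rank]

def stochastic_step(U, s, V, i, j, m_ij, lam, eta, rank):
    """One stochastic subgradient step at X = U diag(s) V^T for sampled entry (i, j)."""
    # A subgradient of the nuclear norm at the low-rank iterate is U V^T,
    # so that term simply shrinks the singular values (clipped at zero).
    s = np.maximum(s - eta * lam, 0.0)
    # Rank-1 subgradient of the sampled squared loss: 2 (X_ij - m_ij) e_i e_j^T.
    x_ij = U[i] @ (s * V[j])
    a = np.zeros(U.shape[0]); a[i] = -eta * 2.0 * (x_ij - m_ij)
    b = np.zeros(V.shape[0]); b[j] = 1.0
    return rank_one_svd_update(U, s, V, a, b, rank)

# Toy usage on a random low-rank matrix (illustrative only).
rng = np.random.default_rng(0)
m_rows, n_cols, rank, lam, eta = 50, 40, 5, 0.1, 0.05
M = rng.standard_normal((m_rows, rank)) @ rng.standard_normal((rank, n_cols))
U, s, Vt = np.linalg.svd(0.01 * rng.standard_normal((m_rows, n_cols)), full_matrices=False)
U, s, V = U[:, :rank], s[:rank], Vt[:rank].T
for _ in range(2000):
    i, j = rng.integers(m_rows), rng.integers(n_cols)
    U, s, V = stochastic_step(U, s, V, i, j, M[i, j], lam, eta, rank)
```

Because the factors U and V stay thin (only r columns), each iteration touches small matrices, predictions for any entry follow from one short dot product, and the whole iterate fits comfortably in memory; these are the practical properties the abstract emphasizes.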

Authors (4)
  1. Haim Avron (51 papers)
  2. Satyen Kale (50 papers)
  3. Shiva Kasiviswanathan (5 papers)
  4. Vikas Sindhwani (60 papers)
Citations (67)
