Open Problem: Anytime Convergence Rate of Gradient Descent (2406.13888v1)
Published 19 Jun 2024 in math.OC and cs.LG
Abstract: Recent results show that vanilla gradient descent can be accelerated for smooth convex objectives, merely by changing the stepsize sequence. We show that this can lead to surprisingly large errors indefinitely, and therefore ask: Is there any stepsize schedule for gradient descent that accelerates the classic $\mathcal{O}(1/T)$ convergence rate, at *any* stopping time $T$?
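To make the phenomenon concrete, here is a minimal, hypothetical sketch (not the paper's construction, nor the accelerated schedules it references): gradient descent on a one-dimensional smooth quadratic, comparing the constant stepsize $1/L$, whose error decreases monotonically so the classic $\mathcal{O}(1/T)$ bound applies at every stopping time, against a made-up non-constant schedule with occasional long steps, whose error visibly spikes at intermediate stopping times even though it converges overall. All names and parameter choices below are illustrative assumptions.

```python
import numpy as np

# Illustrative 1-D example (not from the paper): f(x) = 0.4 * x^2 is
# L-smooth with L = 1, since its curvature is 0.8 <= L.
L = 1.0
mu = 0.8  # actual curvature of the quadratic

def f(x):
    return 0.5 * mu * x**2

def grad(x):
    return mu * x

def run_gd(stepsizes, x0):
    """Gradient descent with a prescribed stepsize sequence; returns f(x_t) for all t."""
    x = x0
    values = [f(x)]
    for eta in stepsizes:
        x -= eta * grad(x)
        values.append(f(x))
    return values

T, x0 = 64, 1.0

# Constant stepsize 1/L: f(x_t) decreases monotonically (on this strongly
# convex toy the decrease is in fact linear), so the guarantee holds at
# *every* stopping time T.
constant = [1.0 / L] * T

# A hypothetical non-constant schedule: short steps with an occasional long
# step of 4/L. Steps longer than 2/L overshoot on this function, so the
# error *increases* at those times even though the iterates converge overall.
varying = [4.0 / L if (t + 1) % 16 == 0 else 0.5 / L for t in range(T)]

vals_c = run_gd(constant, x0)
vals_v = run_gd(varying, x0)

# Compare suboptimality just before and just after each long step.
for t in (15, 16, 31, 32, 63, 64):
    print(f"T={t:2d}  constant: {vals_c[t]:.3e}  varying: {vals_v[t]:.3e}")
```

The printout shows the varying run's error jumping upward at $T = 16, 32, 64$ while the constant-stepsize run decreases monotonically. Schedules that accelerate the last iterate do rely on steps longer than $2/L$, which is (informally) why guarantees at arbitrary stopping times are delicate; the spiky schedule above is merely a cartoon of that mechanism, not an accelerated schedule itself.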
- Guy Kornowski (16 papers)
- Ohad Shamir (110 papers)