No quantum speedup over gradient descent for non-smooth convex optimization

(2010.01801)
Published Oct 5, 2020 in cs.DS, math.OC, and quant-ph

Abstract

We study the first-order convex optimization problem, where we have black-box access to a (not necessarily smooth) function $f:\mathbb{R}^n \to \mathbb{R}$ and its (sub)gradient. Our goal is to find an $\epsilon$-approximate minimum of $f$ starting from a point that is distance at most $R$ from the true minimum. If $f$ is $G$-Lipschitz, then the classic gradient descent algorithm solves this problem with $O((GR/\epsilon)^{2})$ queries. Importantly, the number of queries is independent of the dimension $n$ and gradient descent is optimal in this regard: no deterministic or randomized algorithm can achieve better complexity that is still independent of the dimension $n$. In this paper we reprove the randomized lower bound of $\Omega((GR/\epsilon)^{2})$ using a simpler argument than previous lower bounds. We then show that although the function family used in the lower bound is hard for randomized algorithms, it can be solved using $O(GR/\epsilon)$ quantum queries. We then show an improved lower bound against quantum algorithms using a different set of instances and establish our main result that in general even quantum algorithms need $\Omega((GR/\epsilon)^{2})$ queries to solve the problem. Hence there is no quantum speedup over gradient descent for black-box first-order convex optimization without further assumptions on the function family.
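
For concreteness, the sketch below illustrates the classical $O((GR/\epsilon)^{2})$ query bound mentioned in the abstract using the standard projected subgradient method with step size $R/(G\sqrt{T})$. It is not code from the paper (the paper is about oracle complexity, not implementations); the oracle `subgrad` and the example objective are hypothetical placeholders.

```python
import numpy as np

def subgradient_descent(subgrad, x0, G, R, eps):
    """Projected subgradient method for a convex, G-Lipschitz f (sketch).

    `subgrad(x)` is assumed to return a subgradient of f at x, and a
    minimizer is assumed to lie within distance R of x0. With
    T = (G*R/eps)**2 queries and step size R/(G*sqrt(T)), the averaged
    iterate is an eps-approximate minimizer by the standard analysis.
    """
    x0 = np.asarray(x0, dtype=float)
    T = int(np.ceil((G * R / eps) ** 2))
    eta = R / (G * np.sqrt(T))
    x = x0.copy()
    avg = np.zeros_like(x)
    for _ in range(T):
        g = subgrad(x)                 # one (sub)gradient query
        x = x - eta * g
        # Project back onto the ball of radius R around x0.
        d = x - x0
        nrm = np.linalg.norm(d)
        if nrm > R:
            x = x0 + (R / nrm) * d
        avg += x / T
    return avg

# Hypothetical usage: f(x) = ||x||_2 is convex and 1-Lipschitz, minimized at 0.
f_subgrad = lambda x: x / np.linalg.norm(x) if np.linalg.norm(x) > 0 else np.zeros_like(x)
x_eps = subgradient_descent(f_subgrad, x0=np.ones(5), G=1.0, R=np.sqrt(5), eps=0.1)
```

Note the dimension $n$ never enters the iteration count, matching the dimension-independence the abstract emphasizes; the paper's result is that quantum queries cannot beat this $T = (GR/\epsilon)^{2}$ scaling in general.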
