A quantum-classical performance separation in nonconvex optimization

(2311.00811)
Published Nov 1, 2023 in quant-ph, cs.DS, cs.LG, and math.OC

Abstract

In this paper, we identify a family of nonconvex continuous optimization instances, each $d$-dimensional instance with $2^d$ local minima, to demonstrate a quantum-classical performance separation. Specifically, we prove that the recently proposed Quantum Hamiltonian Descent (QHD) algorithm [Leng et al., arXiv:2303.01471] is able to solve any $d$-dimensional instance from this family using $\widetilde{\mathcal{O}}(d^3)$ quantum queries to the function value and $\widetilde{\mathcal{O}}(d^4)$ additional 1-qubit and 2-qubit elementary quantum gates. On the other hand, a comprehensive empirical study suggests that representative state-of-the-art classical optimization algorithms/solvers (including Gurobi) would require super-polynomial time to solve such optimization instances.
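For intuition, QHD (per Leng et al., arXiv:2303.01471) evolves a quantum wave function under a time-dependent Schrödinger equation whose kinetic term is gradually damped while the objective function acts as a growing potential, so that measuring the final state concentrates probability near low-cost regions. The sketch below is a minimal classical simulation of that idea on a toy 1-D nonconvex objective; the grid size, damping schedule, and objective are illustrative assumptions and are not the paper's hard instance family or its query model.

```python
# Minimal classical sketch of QHD-style evolution on a 1-D grid.
# Illustrative only: the objective, schedule, and discretization are assumptions,
# not the paper's construction or complexity analysis.
import numpy as np

n = 512                                    # number of grid points (assumption)
x = np.linspace(-2.0, 2.0, n, endpoint=False)
dx = x[1] - x[0]
k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)  # Fourier wavenumbers

# Toy nonconvex objective with multiple local minima (not the paper's instances)
f = 0.25 * (x**2 - 1.0)**2 + 0.1 * np.sin(5.0 * x)

# Initial state: uniform superposition over the domain
psi = np.ones(n, dtype=complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# Split-operator (Strang) evolution of
#   i dpsi/dt = [ -0.5 * a(t) * d^2/dx^2 + b(t) * f(x) ] psi
# with a(t) shrinking and b(t) growing, an assumed schedule in the spirit of
# QHD's damped kinetic / amplified potential terms.
T, steps = 10.0, 4000
dt = T / steps
for s in range(steps):
    t = (s + 0.5) * dt
    a = 1.0 / (1.0 + t) ** 2               # kinetic prefactor decays (assumption)
    b = (1.0 + t) ** 2                      # potential prefactor grows (assumption)
    psi *= np.exp(-1j * b * f * dt / 2.0)   # half-step in the potential
    psi = np.fft.ifft(np.exp(-1j * a * 0.5 * k**2 * dt) * np.fft.fft(psi))  # kinetic step
    psi *= np.exp(-1j * b * f * dt / 2.0)   # half-step in the potential

# Sampling the final state concentrates probability near low values of f.
prob = np.abs(psi) ** 2 * dx
i_best = int(np.argmax(prob))
print(f"most likely x = {x[i_best]:.3f}, f(x) = {f[i_best]:.3f}")
```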
