Parameter Generation of Quantum Approximate Optimization Algorithm with Diffusion Model (2407.12242v3)

Published 17 Jul 2024 in quant-ph

Abstract: Quantum computing presents a compelling prospect for revolutionizing the field of combinatorial optimization, by virtue of unique attributes of quantum mechanics such as superposition and entanglement. The Quantum Approximate Optimization Algorithm (QAOA), a variational hybrid quantum-classical algorithm, stands out as a leading proposal for efficiently solving the Max-Cut problem, a representative example of combinatorial optimization. However, its promised advantages depend strongly on the parameter initialization strategy, a critical aspect given the non-convex, complex optimization landscapes plagued by low-quality local minima. In this work, we therefore formulate the search for good initial parameters as a generative task in which a generative machine learning model, specifically a denoising diffusion probabilistic model (DDPM), is trained to produce high-performing initial parameters for QAOA. The diffusion model learns the distribution of high-performing parameters and then synthesizes new parameters closer to the optimal ones. Experiments on Max-Cut problem instances of various sizes demonstrate that our diffusion process consistently enhances QAOA effectiveness compared with random parameter initialization. Moreover, our framework shows the capacity to train on small, classically simulatable problem instances and extrapolate to larger instances, reducing quantum computational resource overhead.
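
The abstract describes a two-stage pipeline: collect high-performing QAOA parameters (gamma, beta) by classically optimizing small Max-Cut instances, then train a DDPM on those vectors and sample fresh vectors to initialize QAOA on new instances. The sketch below is a minimal illustration of that idea, not the authors' implementation: the QAOA depth P, the noise schedule, the network architecture, and the tensor good_params (assumed to hold previously optimized parameter vectors) are all assumptions made for the example.

```python
# Illustrative DDPM over QAOA parameter vectors theta = (gamma_1..gamma_P, beta_1..beta_P).
# Not the paper's code; names and hyperparameters below are assumptions.
import torch
import torch.nn as nn

P = 4            # assumed QAOA depth
DIM = 2 * P      # gammas and betas concatenated into one vector
T = 200          # number of diffusion steps

# Linear noise schedule and the standard DDPM quantities.
beta_sched = torch.linspace(1e-4, 0.02, T)   # diffusion betas (not QAOA mixer angles)
alphas = 1.0 - beta_sched
alpha_bar = torch.cumprod(alphas, dim=0)

class EpsNet(nn.Module):
    """Small MLP that predicts the injected noise from (noisy theta, timestep)."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        t_feat = (t.float() / T).unsqueeze(-1)          # timestep as an extra feature
        return self.net(torch.cat([x, t_feat], dim=-1))

model = EpsNet(DIM)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(good_params):
    """One DDPM step on a batch of high-performing parameter vectors (batch, DIM)."""
    t = torch.randint(0, T, (good_params.shape[0],))
    eps = torch.randn_like(good_params)
    ab = alpha_bar[t].unsqueeze(-1)
    noisy = ab.sqrt() * good_params + (1.0 - ab).sqrt() * eps   # forward (noising) process
    loss = ((model(noisy, t) - eps) ** 2).mean()                # predict the noise
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

@torch.no_grad()
def sample(n):
    """Reverse diffusion: synthesize n candidate QAOA initializations."""
    x = torch.randn(n, DIM)
    for t in reversed(range(T)):
        eps_hat = model(x, torch.full((n,), t))
        a, ab = alphas[t], alpha_bar[t]
        x = (x - (1.0 - a) / (1.0 - ab).sqrt() * eps_hat) / a.sqrt()
        if t > 0:
            x = x + beta_sched[t].sqrt() * torch.randn_like(x)
    return x   # rows are (gamma, beta) vectors to seed the QAOA optimizer
```

Rows returned by sample() would seed the classical optimizer inside the QAOA loop in place of random initialization; collecting good_params and evaluating the quantum circuits (e.g. on a simulator) are outside the scope of this sketch.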
