
Compressive Phase Retrieval via Generalized Approximate Message Passing (1405.5618v2)

Published 22 May 2014 in cs.IT and math.IT

Abstract: In phase retrieval, the goal is to recover a signal $\mathbf{x}\in\mathbb{C}^N$ from the magnitudes of linear measurements $\mathbf{Ax}\in\mathbb{C}^M$. While recent theory has established that $M\approx 4N$ intensity measurements are necessary and sufficient to recover generic $\mathbf{x}$, there is great interest in reducing the number of measurements through the exploitation of sparse $\mathbf{x}$, which is known as compressive phase retrieval. In this work, we detail a novel, probabilistic approach to compressive phase retrieval based on the generalized approximate message passing (GAMP) algorithm. We then present a numerical study of the proposed PR-GAMP algorithm, demonstrating its excellent phase-transition behavior, robustness to noise, and runtime. Our experiments suggest that approximately $M\geq 2K\log_2(N/K)$ intensity measurements suffice to recover $K$-sparse Bernoulli-Gaussian signals for $\mathbf{A}$ with i.i.d. Gaussian entries and $K\ll N$. Meanwhile, when recovering a 6k-sparse 65k-pixel grayscale image from 32k randomly masked and blurred Fourier intensity measurements at 30 dB measurement SNR, PR-GAMP achieved an output SNR of no less than 28 dB in all of 100 random trials, with a median runtime of only 7.3 seconds. Compared to the recently proposed CPRL, sparse-Fienup, and GESPAR algorithms, our experiments suggest that PR-GAMP has a superior phase transition and orders-of-magnitude faster runtimes as the sparsity and problem dimensions increase.
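The measurement model described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's PR-GAMP algorithm: it draws a $K$-sparse Bernoulli-Gaussian signal, an i.i.d. complex Gaussian matrix $\mathbf{A}$, and forms the phaseless (intensity) observations $|\mathbf{Ax}|$; the dimensions $N$ and $K$ are illustrative values chosen here, and $M$ follows the $2K\log_2(N/K)$ rule of thumb suggested by the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem dimensions (not taken from the paper's experiments)
N, K = 256, 8
# Sample-complexity rule of thumb suggested by the paper's phase transitions
M = int(np.ceil(2 * K * np.log2(N / K)))

# K-sparse Bernoulli-Gaussian signal: K random support locations,
# circularly symmetric complex Gaussian amplitudes
x = np.zeros(N, dtype=complex)
support = rng.choice(N, size=K, replace=False)
x[support] = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)

# Measurement matrix A with i.i.d. complex Gaussian entries
A = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)

# Phaseless (intensity-only) measurements: only |Ax| is observed,
# so any global phase rotation of x yields the same data
y = np.abs(A @ x)
```

Because only magnitudes are observed, $\mathbf{x}$ is at best recoverable up to a global phase, which is why PR-GAMP's output SNR is assessed after resolving that ambiguity.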

Citations (26)
