Emergent Mind

The estimation performance of nonlinear least squares for phase retrieval

(1904.09711)
Published Apr 22, 2019 in cs.IT and math.IT

Abstract

Suppose that $\mathbf{y}=\lvert A\mathbf{x}_0\rvert+\eta$, where $\mathbf{x}_0 \in \mathbb{R}^d$ is the target signal and $\eta\in \mathbb{R}^m$ is a noise vector. The aim of phase retrieval is to estimate $\mathbf{x}_0$ from $\mathbf{y}$. A popular model for estimating $\mathbf{x}_0$ is the nonlinear least squares $\widehat{\mathbf{x}}:=\operatorname{argmin}_{\mathbf{x}} \|\lvert A \mathbf{x}\rvert-\mathbf{y}\|_2$. Many efficient algorithms have been developed for solving this model, such as the seminal error reduction algorithm. In this paper, we establish the estimation performance of the model by proving that $\|\widehat{\mathbf{x}}-\mathbf{x}_0\| \lesssim \|\eta\|_2/\sqrt{m}$ under the assumption that $A$ is a Gaussian random matrix. We also prove that the reconstruction error $\|\eta\|_2/\sqrt{m}$ is sharp. For the case where $\mathbf{x}_0$ is sparse, we study the estimation performance of both the nonlinear Lasso for phase retrieval and its unconstrained version. Our results are non-asymptotic, and we do not assume any distribution on the noise $\eta$. To the best of our knowledge, our results provide the first theoretical guarantees for the nonlinear least squares and for the nonlinear Lasso of phase retrieval.
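The error reduction algorithm mentioned above can be sketched as alternating projections: at each step, keep the measured magnitudes $\mathbf{y}$, reuse the signs of the current fit $A\mathbf{x}$, and solve the resulting linear least-squares problem. The following is a minimal illustration in the real-valued, noiseless Gaussian setting of the abstract; the problem sizes, iteration count, and the spectral initialization are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem setup from the abstract: y = |A x0| + eta, with A Gaussian.
# Noiseless case (eta = 0) and sizes m, d chosen here for illustration.
m, d = 400, 20
A = rng.standard_normal((m, d))
x0 = rng.standard_normal(d)
y = np.abs(A @ x0)

# Spectral initialization (a common choice, assumed here): leading
# eigenvector of (1/m) * A^T diag(y^2) A, scaled by an estimate of ||x0||.
M = (A.T * y**2) @ A / m
_, V = np.linalg.eigh(M)
x = V[:, -1] * np.sqrt(np.mean(y**2))

# Error reduction: impose the measured magnitudes y on the current
# signs of A @ x, then solve the linear least-squares step via a
# precomputed pseudoinverse.
A_pinv = np.linalg.pinv(A)
for _ in range(500):
    x = A_pinv @ (y * np.sign(A @ x))

# x0 is only identifiable up to a global sign, so measure the error
# against both +x0 and -x0.
err = min(np.linalg.norm(x - x0), np.linalg.norm(x + x0))
```

In this well-conditioned regime ($m/d = 20$, no noise) the iteration typically converges to $\mathbf{x}_0$ up to global sign; with noise $\eta$, the abstract's bound says the residual error scales like $\|\eta\|_2/\sqrt{m}$.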
