An Expectation-Maximization Approach to Tuning Generalized Vector Approximate Message Passing (1806.10079v1)
Abstract: Generalized Vector Approximate Message Passing (GVAMP) is an efficient iterative algorithm for approximately minimum-mean-squared-error estimation of a random vector $\mathbf{x}\sim p_{\mathbf{x}}(\mathbf{x})$ from generalized linear measurements, i.e., measurements of the form $\mathbf{y}=Q(\mathbf{z})$ where $\mathbf{z}=\mathbf{Ax}$ with known $\mathbf{A}$, and $Q(\cdot)$ is a noisy, potentially nonlinear, componentwise function. Problems of this form show up in numerous applications, including robust regression, binary classification, quantized compressive sensing, and phase retrieval. In some cases, the prior $p_{\mathbf{x}}$ and/or channel $Q(\cdot)$ depend on unknown deterministic parameters $\boldsymbol{\theta}$, which prevents a direct application of GVAMP. In this paper we propose a way to combine expectation maximization (EM) with GVAMP to jointly estimate $\mathbf{x}$ and $\boldsymbol{\theta}$. We then demonstrate how EM-GVAMP can solve the phase retrieval problem with unknown measurement-noise variance.
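The generalized-linear measurement model $\mathbf{y}=Q(\mathbf{Ax})$ from the abstract can be illustrated with a small simulation. The sketch below, which is not from the paper, draws a signal from an i.i.d. Gaussian prior (a stand-in for $p_{\mathbf{x}}$) and uses a noisy magnitude channel as $Q(\cdot)$, matching the phase-retrieval example; the dimensions and noise level are arbitrary illustrative choices, with the noise standard deviation playing the role of the unknown parameter $\boldsymbol{\theta}$.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 100, 400   # signal and measurement dimensions (illustrative)
sigma = 0.1       # measurement-noise std; the unknown parameter in EM-GVAMP's setting

# Prior draw: x ~ p_x(x); an i.i.d. Gaussian prior as a stand-in
x = rng.standard_normal(n)

# Known linear operator A and linear transform z = A x
A = rng.standard_normal((m, n)) / np.sqrt(n)
z = A @ x

# Componentwise noisy nonlinear channel Q(.): phase retrieval keeps
# only the magnitude of z, corrupted by additive Gaussian noise
y = np.abs(z) + sigma * rng.standard_normal(m)

print(y.shape)  # (400,)
```

EM-GVAMP would take only `y` and `A` as inputs and jointly estimate `x` and the noise variance `sigma**2`.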