
Critical Behavior and Universality Classes for an Algorithmic Phase Transition in Sparse Reconstruction

(1509.08995)
Published Sep 30, 2015 in cs.IT, cond-mat.stat-mech, and math.IT

Abstract

Recovery of an $N$-dimensional, $K$-sparse solution $\mathbf{x}$ from an $M$-dimensional vector of measurements $\mathbf{y}$ for multivariate linear regression can be accomplished by minimizing a suitably penalized least-mean-square cost $||\mathbf{y}-\mathbf{H}\mathbf{x}||_2^2+\lambda V(\mathbf{x})$. Here $\mathbf{H}$ is a known matrix and $V(\mathbf{x})$ is an algorithm-dependent sparsity-inducing penalty. For 'random' $\mathbf{H}$, in the limit $\lambda \rightarrow 0$ and $M,N,K\rightarrow \infty$, keeping $\rho=K/N$ and $\alpha=M/N$ fixed, exact recovery is possible for $\alpha$ past a critical value $\alpha_c = \alpha_c(\rho)$. Assuming $\mathbf{x}$ has iid entries, the critical curve exhibits some universality, in that its shape does not depend on the distribution of $\mathbf{x}$. However, the algorithmic phase transition occurring at $\alpha=\alpha_c$ and the associated universality classes remain ill-understood from a statistical physics perspective, i.e. in terms of scaling exponents near the critical curve. In this article, we analyze the mean-field equations for two algorithms, Basis Pursuit ($V(\mathbf{x})=||\mathbf{x}||_1$) and Elastic Net ($V(\mathbf{x})= ||\mathbf{x}||_1 + \tfrac{g}{2} ||\mathbf{x}||_2^2$), and show that they belong to different universality classes in the sense of scaling exponents, with the Mean Squared Error (MSE) of the recovered vector scaling as $\lambda^{4/3}$ and $\lambda$, respectively, for small $\lambda$ on the critical line. In the presence of additive noise, we find that, when $\alpha>\alpha_c$, MSE is minimized at a non-zero value of $\lambda$, whereas at $\alpha=\alpha_c$, MSE always increases with $\lambda$.
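
To make the setup concrete, below is a minimal numerical sketch (not the authors' code or analysis) of the penalized least-squares recovery described in the abstract, using plain proximal-gradient (ISTA) iterations. The dimensions $N$, $M$, $K$, the values of $\lambda$ and $g$, the step size, and the iteration count are illustrative assumptions, not parameters from the paper; setting $g=0$ gives the Basis Pursuit (LASSO) penalty and $g>0$ the Elastic Net penalty.

```python
import numpy as np

def ista(H, y, lam, g=0.0, n_iter=5000):
    """Minimize ||y - H x||_2^2 + lam*(||x||_1 + (g/2)*||x||_2^2) by proximal gradient (ISTA)."""
    L = 2.0 * np.linalg.norm(H, 2) ** 2        # Lipschitz constant of the smooth least-squares part
    x = np.zeros(H.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * H.T @ (H @ x - y)         # gradient of ||y - H x||_2^2
        z = x - grad / L
        # prox of (lam/L)*(||.||_1 + (g/2)*||.||_2^2): soft-threshold, then shrink
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0) / (1.0 + lam * g / L)
    return x

# Illustrative problem sizes (assumptions, not values from the paper):
rng = np.random.default_rng(0)
N, M, K = 200, 120, 20                         # rho = K/N = 0.1, alpha = M/N = 0.6
H = rng.standard_normal((M, N)) / np.sqrt(M)   # 'random' measurement matrix
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = H @ x_true                                 # noiseless measurements

for lam in (1e-1, 1e-2, 1e-3):
    mse_bp = np.mean((ista(H, y, lam, g=0.0) - x_true) ** 2)   # Basis Pursuit penalty
    mse_en = np.mean((ista(H, y, lam, g=1.0) - x_true) ** 2)   # Elastic Net penalty
    print(f"lambda={lam:.0e}  MSE(Basis Pursuit)={mse_bp:.2e}  MSE(Elastic Net)={mse_en:.2e}")
```

On the critical line $\alpha=\alpha_c(\rho)$ the paper predicts MSE $\sim \lambda^{4/3}$ for Basis Pursuit and MSE $\sim \lambda$ for Elastic Net as $\lambda \rightarrow 0$; a finite-size experiment like this one can at best hint at those exponents, since the scaling analysis applies in the $M,N,K\rightarrow\infty$ limit.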
