
Asymptotic Analysis of LASSOs Solution Path with Implications for Approximate Message Passing (1309.5979v1)

Published 23 Sep 2013 in math.ST, cs.IT, math.IT, stat.ML, and stat.TH

Abstract: This paper concerns the performance of the LASSO (also known as basis pursuit denoising) for recovering sparse signals from undersampled, randomized, noisy measurements. We consider the recovery of the signal $x_o \in \mathbb{R}^N$ from $n$ random and noisy linear observations $y = Ax_o + w$, where $A$ is the measurement matrix and $w$ is the noise. The LASSO estimate of $x_o$ is given by the solution of the optimization problem $\hat{x}_{\lambda} = \arg \min_x \frac{1}{2} \|y - Ax\|_2^2 + \lambda \|x\|_1$. Despite major progress in the theoretical analysis of the LASSO solution, little is known about its behavior as a function of the regularization parameter $\lambda$. In this paper we study two questions in the asymptotic setting (i.e., where $N \rightarrow \infty$, $n \rightarrow \infty$ while the ratio $n/N$ converges to a fixed number in $(0,1)$): (i) how does the size of the active set $\|\hat{x}_{\lambda}\|_0/N$ behave as a function of $\lambda$, and (ii) how does the mean square error $\|\hat{x}_{\lambda} - x_o\|_2^2/N$ behave as a function of $\lambda$? We then employ these results in a new, reliable algorithm for solving LASSO based on approximate message passing (AMP).
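As an illustrative sketch of the kind of AMP iteration the paper builds on (not the paper's tuned algorithm), each step applies componentwise soft thresholding to a pseudo-data vector and updates the residual with an Onsager correction term. The adaptive threshold rule `theta = alpha * ||z|| / sqrt(n)` and the value of `alpha` below are common heuristic choices assumed for this sketch, not taken from the paper, which instead shows how to tune the threshold to target a given $\lambda$:

```python
import numpy as np

def soft_threshold(v, theta):
    """Componentwise soft thresholding eta(v; theta) = sign(v) * max(|v| - theta, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp(y, A, alpha=1.5, iters=50):
    """Sketch of an AMP iteration for the LASSO.

    Assumes A has i.i.d. entries with variance 1/n (the random-design
    setting of the abstract). alpha scales the adaptive threshold; the
    paper's contribution is the mapping between this tuning and lambda.
    """
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()                                # initial residual
    for _ in range(iters):
        theta = alpha * np.linalg.norm(z) / np.sqrt(n)  # noise-level proxy
        pseudo = x + A.T @ z                    # pseudo-data vector
        x = soft_threshold(pseudo, theta)       # denoising step
        # Onsager correction: average derivative of the soft threshold
        # equals the fraction of active (nonzero) coordinates.
        z = y - A @ x + (np.count_nonzero(x) / n) * z
    return x
```

Note that the size of the active set, $\|\hat{x}_\lambda\|_0/N$, shows up directly in the Onsager term, which is one reason its asymptotic behavior as a function of the threshold (equivalently, of $\lambda$) matters for AMP-based solvers.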

Citations (20)
