Sharp MSE Bounds for Proximal Denoising (1305.2714v5)

Published 13 May 2013 in cs.IT, math.IT, and math.OC

Abstract: Denoising has to do with estimating a signal $x_0$ from its noisy observations $y=x_0+z$. In this paper, we focus on the "structured denoising problem", where the signal $x_0$ possesses a certain structure and $z$ has independent normally distributed entries with mean zero and variance $\sigma^2$. We employ a structure-inducing convex function $f(\cdot)$ and solve $\min_x\{\frac{1}{2}\|y-x\|_2^2+\sigma\lambda f(x)\}$ to estimate $x_0$, for some $\lambda>0$. Common choices for $f(\cdot)$ include the $\ell_1$ norm for sparse vectors, the $\ell_1-\ell_2$ norm for block-sparse signals, and the nuclear norm for low-rank matrices. The metric we use to evaluate the performance of an estimate $x^*$ is the normalized mean-squared-error $\text{NMSE}(\sigma)=\frac{\mathbb{E}\|x^*-x_0\|_2^2}{\sigma^2}$. We show that the NMSE is maximized as $\sigma\rightarrow 0$ and we find the \emph{exact} worst-case NMSE, which has a simple geometric interpretation: the mean-squared distance of a standard normal vector to the $\lambda$-scaled subdifferential $\lambda\partial f(x_0)$. When $\lambda$ is optimally tuned to minimize the worst-case NMSE, our results can be related to the constrained denoising problem $\min_{f(x)\leq f(x_0)}\|y-x\|_2$. The paper also connects these results to the generalized LASSO problem, in which one solves $\min_{f(x)\leq f(x_0)}\|y-Ax\|_2$ to estimate $x_0$ from noisy linear observations $y=Ax_0+z$. We show that certain properties of the LASSO problem are closely related to the denoising problem. In particular, we characterize the normalized LASSO cost and show that it exhibits a "phase transition" as a function of the number of observations. Our results are significant in two ways. First, we find a simple formula for the performance of a general convex estimator. Second, we establish a connection between the denoising and linear inverse problems.
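
As a concrete illustration of the abstract's setup, the following minimal Python sketch (not from the paper; the signal, dimensions, and $\lambda=2$ are arbitrary illustrative choices rather than the paper's optimal tuning) takes $f$ to be the $\ell_1$ norm, in which case the proximal denoiser reduces to entrywise soft-thresholding. It compares the empirical NMSE at decreasing noise levels against a Monte Carlo estimate of the worst-case formula $\mathbb{E}\,\text{dist}(g,\lambda\partial f(x_0))^2$, where for the $\ell_1$ norm the scaled subdifferential is a product set: $\{\lambda\,\text{sign}(x_{0,i})\}$ on the support and the interval $[-\lambda,\lambda]$ off it.

```python
# Minimal numerical sketch (illustrative, not the paper's code):
# proximal denoising with f = l1 norm, i.e. soft-thresholding.
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: entrywise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def empirical_nmse(x0, sigma, lam, trials=2000):
    """Monte Carlo estimate of NMSE(sigma) = E||x* - x0||_2^2 / sigma^2,
    where x* solves min_x { 1/2 ||y-x||_2^2 + sigma*lam*||x||_1 }."""
    err = 0.0
    for _ in range(trials):
        y = x0 + sigma * rng.standard_normal(x0.size)
        x_star = soft_threshold(y, sigma * lam)
        err += np.sum((x_star - x0) ** 2)
    return err / (trials * sigma**2)

def worst_case_nmse(x0, lam, trials=2000):
    """Monte Carlo estimate of E dist(g, lam * d||.||_1(x0))^2 for
    standard normal g. Coordinatewise: on the support the nearest
    point of the scaled subdifferential is lam*sign(x0_i); off the
    support it is the projection of g_i onto [-lam, lam], so the
    distance is the soft-thresholded residual."""
    support = x0 != 0
    total = 0.0
    for _ in range(trials):
        g = rng.standard_normal(x0.size)
        on = (g[support] - lam * np.sign(x0[support])) ** 2
        off = soft_threshold(g[~support], lam) ** 2
        total += on.sum() + off.sum()
    return total / trials

x0 = np.zeros(500)
x0[:25] = 1.0          # a 5%-sparse signal (hypothetical example)
lam = 2.0              # arbitrary illustrative choice of lambda

for sigma in (1.0, 0.1, 0.01):
    print(f"sigma={sigma:5.2f}  empirical NMSE = {empirical_nmse(x0, sigma, lam):.2f}")
print(f"worst-case E dist(g, lam*df(x0))^2 = {worst_case_nmse(x0, lam):.2f}")
```

Running this, the empirical NMSE climbs toward the worst-case value as $\sigma$ shrinks, consistent with the claim that the NMSE is maximized as $\sigma\rightarrow 0$ and that its limit equals the mean-squared distance to the scaled subdifferential.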

Citations (8)
