
Adapting to Unknown Noise Distribution in Matrix Denoising

(arXiv:1810.02954)
Published Oct 6, 2018 in math.ST, cs.IT, math.IT, stat.ME, and stat.TH

Abstract

We consider the problem of estimating an unknown matrix $\boldsymbol{X}\in {\mathbb R}^{m\times n}$ from observations $\boldsymbol{Y} = \boldsymbol{X}+\boldsymbol{W}$, where $\boldsymbol{W}$ is a noise matrix with independent and identically distributed entries, so as to minimize the estimation error measured in operator norm. Assuming that the underlying signal $\boldsymbol{X}$ is low-rank and incoherent with respect to the canonical basis, we prove that the minimax risk is equivalent to $(\sqrt{m}\vee\sqrt{n})/\sqrt{I_W}$ in the high-dimensional limit $m,n\to\infty$, where $I_W$ is the Fisher information of the noise. Crucially, we develop an efficient procedure that achieves this risk, adaptively over the noise distribution (under certain regularity assumptions). Letting $\boldsymbol{X} = \boldsymbol{U}\boldsymbol{\Sigma}\boldsymbol{V}^{\sf T}$ --where $\boldsymbol{U}\in {\mathbb R}^{m\times r}$, $\boldsymbol{V}\in{\mathbb R}^{n\times r}$ are orthogonal, and $r$ is kept fixed as $m,n\to\infty$-- we use our method to estimate $\boldsymbol{U}$, $\boldsymbol{V}$. Standard spectral methods provide non-trivial estimates of the factors $\boldsymbol{U},\boldsymbol{V}$ (weak recovery) only if the singular values of $\boldsymbol{X}$ are larger than $(mn)^{1/4}{\rm Var}(W_{11})^{1/2}$. We prove that the new approach achieves weak recovery down to the information-theoretically optimal threshold $(mn)^{1/4}I_W^{-1/2}$.
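
To make the setup concrete, here is a minimal Python sketch (not the paper's adaptive estimator) that simulates the observation model $\boldsymbol{Y}=\boldsymbol{X}+\boldsymbol{W}$ with a rank-$r$ signal and i.i.d. Laplace noise, computes the baseline spectral (truncated-SVD) estimate of $\boldsymbol{U},\boldsymbol{V}$, and evaluates the two weak-recovery thresholds quoted in the abstract. The dimensions, rank, signal strength, and the choice of Laplace noise are illustrative assumptions.

```python
# Illustrative sketch only: simulate Y = X + W with a low-rank X and i.i.d.
# Laplace noise, run the baseline spectral (truncated-SVD) estimate of U, V,
# and compare the two weak-recovery thresholds from the abstract.
# Problem sizes, rank, signal strength, and the Laplace noise law are
# arbitrary choices for the demonstration; the paper's adaptive,
# Fisher-information-achieving procedure is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 400, 600, 2          # assumed dimensions and (fixed) rank
b = 1.0                        # Laplace scale: Var(W_11) = 2*b**2, I_W = 1/b**2

# Low-rank, incoherent signal X = U Sigma V^T with orthonormal factors
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
sigma = 1.5 * (m * n) ** 0.25 * np.array([2.0, 1.0])   # singular values of X
X = (U * sigma) @ V.T

# Observations with i.i.d. Laplace noise
W = rng.laplace(scale=b, size=(m, n))
Y = X + W

# Baseline spectral estimate: top-r singular vectors of Y
Uh, sh, Vth = np.linalg.svd(Y, full_matrices=False)
U_hat, V_hat = Uh[:, :r], Vth[:r, :].T

# Column-wise overlaps |<u_i, u_hat_i>| quantify weak recovery of each factor
overlap_U = np.abs(np.sum(U * U_hat, axis=0))
overlap_V = np.abs(np.sum(V * V_hat, axis=0))
print("overlaps U:", np.round(overlap_U, 3))
print("overlaps V:", np.round(overlap_V, 3))

# Weak-recovery thresholds from the abstract:
#   spectral : (mn)^{1/4} * Var(W_11)^{1/2}
#   optimal  : (mn)^{1/4} * I_W^{-1/2}
var_W = 2 * b**2               # Laplace variance
I_W = 1 / b**2                 # Fisher information of the Laplace location family
thr_spectral = (m * n) ** 0.25 * np.sqrt(var_W)
thr_optimal = (m * n) ** 0.25 / np.sqrt(I_W)
print("spectral threshold :", round(thr_spectral, 2))
print("optimal threshold  :", round(thr_optimal, 2))
```

For Laplace noise, ${\rm Var}(W_{11})\,I_W = 2$, so the information-theoretically optimal threshold is smaller than the standard spectral one by a factor of $\sqrt{2}$; this is the kind of gap the adaptive procedure described in the abstract is designed to close.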

