OptShrink: An algorithm for improved low-rank signal matrix denoising by optimal, data-driven singular value shrinkage

(arXiv:1306.6042)
Published Jun 25, 2013 in math.ST, cs.IT, math.IT, stat.ML, and stat.TH

Abstract

The truncated singular value decomposition (SVD) of the measurement matrix is the optimal solution to the representation problem of how to best approximate a noisy measurement matrix using a low-rank matrix. Here, we consider the (unobservable) denoising problem of how to best approximate a low-rank signal matrix buried in noise by optimal (re)weighting of the singular vectors of the measurement matrix. We exploit recent results from random matrix theory to exactly characterize the large matrix limit of the optimal weighting coefficients and show that they can be computed directly from data for a large class of noise models that includes the i.i.d. Gaussian noise case. Our analysis brings into sharp focus the shrinkage-and-thresholding form of the optimal weights and the non-convex nature of the associated shrinkage function (on the singular values), and explains why matrix regularization via singular value thresholding with convex penalty functions (such as the nuclear norm) will always be suboptimal. We validate our theoretical predictions with numerical simulations, develop an implementable algorithm (OptShrink) that realizes the predicted performance gains, and show how our methods can be used to improve estimation in the setting where the measured matrix has missing entries.
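The shrinkage idea in the abstract can be sketched numerically. The following is a minimal, illustrative implementation (not the authors' reference code): it estimates the D-transform empirically from the tail "noise" singular values and reweights each of the top-r singular values by -2·D(s)/D'(s). The function name, the finite-difference derivative, and the choice of which singular values count as noise are assumptions made for this sketch; the paper derives closed-form expressions.

```python
import numpy as np

def optshrink(X, r):
    """Denoise X = (rank-r signal) + noise by reweighting the top-r
    singular vectors with data-driven shrinkage weights (a sketch of
    the OptShrink idea)."""
    n, m = X.shape
    c = min(n, m) / max(n, m)            # aspect ratio of the matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_noise = s[r:]                      # treat the tail as pure noise

    def D(z):
        # Empirical D-transform built from the noise-only singular values.
        phi = np.mean(z / (z ** 2 - s_noise ** 2))
        phibar = c * phi + (1.0 - c) / z
        return phi * phibar

    def D_prime(z, h=1e-6):
        # Central finite difference; an illustrative stand-in for the
        # closed-form derivative given in the paper.
        return (D(z + h) - D(z - h)) / (2.0 * h)

    # Optimal shrinkage weight for each signal singular value.
    w = np.array([-2.0 * D(si) / D_prime(si) for si in s[:r]])
    return U[:, :r] @ np.diag(w) @ Vt[:r, :]
```

For a singular value far above the noise bulk, D(z) ≈ 1/z² and D'(z) ≈ -2/z³, so the weight approaches z itself: strong components are barely shrunk, while components near the noise edge are shrunk aggressively, which is the non-convex behavior the abstract contrasts with convex (e.g., nuclear-norm) thresholding.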
