Information-Theoretic Limits of Matrix Completion

(1504.04970)
Published Apr 20, 2015 in cs.IT and math.IT

Abstract

We propose an information-theoretic framework for matrix completion. The theory goes beyond the low-rank structure and applies to general matrices of "low description complexity". Specifically, we consider $m\times n$ random matrices $\mathbf{X}$ of arbitrary distribution (continuous, discrete, discrete-continuous mixture, or even singular). With $\mathcal{S}$ an $\varepsilon$-support set of $\mathbf{X}$, i.e., $\mathrm{P}[\mathbf{X}\in\mathcal{S}]\geq 1-\varepsilon$, and $\underline{\mathrm{dim}}_{\mathrm{B}}(\mathcal{S})$ denoting the lower Minkowski dimension of $\mathcal{S}$, we show that $k>\underline{\mathrm{dim}}_{\mathrm{B}}(\mathcal{S})$ trace inner product measurements $\langle A_i,\mathbf{X}\rangle=\mathrm{tr}(A_i^{\mathrm{T}}\mathbf{X})$ with measurement matrices $A_i$ suffice to recover $\mathbf{X}$ with probability of error at most $\varepsilon$. The result holds for Lebesgue almost all (a.a.) $A_i$ and does not require incoherence between the $A_i$ and the unknown matrix $\mathbf{X}$. We furthermore show that $k>\underline{\mathrm{dim}}_{\mathrm{B}}(\mathcal{S})$ measurements also suffice to recover the unknown matrix $\mathbf{X}$ from measurements taken with rank-one $A_i$; again, this holds for a.a. rank-one $A_i$. Rank-one measurement matrices are attractive as they require less storage space than general measurement matrices and can be applied faster. Particularizing our results to the recovery of low-rank matrices, we find that $k>(m+n-r)r$ measurements are sufficient to recover matrices of rank at most $r$. Finally, we construct a class of rank-$r$ matrices that can be recovered with arbitrarily small probability of error from $k<(m+n-r)r$ measurements.
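
The rank-one measurement model and the $(m+n-r)r$ degrees-of-freedom count are easy to see concretely. Below is a minimal NumPy sketch, not from the paper: all variable names and the dimensions are illustrative, and the final check is a crude empirical sanity test, not the paper's identifiability argument or a recovery algorithm.

```python
import numpy as np

# Illustrative sketch (not the authors' method): rank-one trace inner
# product measurements of a random rank-r matrix X.
rng = np.random.default_rng(0)
m, n, r = 8, 10, 2

# Random rank-r matrix X = L @ R (almost surely of rank exactly r).
X = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# One more measurement than the sufficient threshold k > (m + n - r) * r,
# which matches the dimension of the set of m x n matrices of rank <= r.
k = (m + n - r) * r + 1  # k = 33 here

# Rank-one measurement matrices A_i = u_i v_i^T, drawn at random (hence
# Lebesgue-generic). The measurement tr(A_i^T X) = u_i^T X v_i can be
# applied without ever forming the m x n matrix A_i, which is what makes
# rank-one measurements cheap to store and fast to apply.
U = rng.standard_normal((k, m))
V = rng.standard_normal((k, n))
y = np.einsum('km,mn,kn->k', U, X, V)  # y_i = u_i^T X v_i

# Crude empirical injectivity check: an independent rank-r matrix X2
# should produce a different measurement vector under the same A_i.
X2 = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
y2 = np.einsum('km,mn,kn->k', U, X2, V)
print(k, np.linalg.norm(y - y2) > 1e-9)  # expected: 33 True
```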
