
Iterative Concave Rank Approximation for Recovering Low-Rank Matrices

(arXiv:1504.01158)
Published Apr 5, 2015 in cs.IT and math.IT

Abstract

In this paper, we propose a new algorithm for recovery of low-rank matrices from compressed linear measurements. The underlying idea of this algorithm is to closely approximate the rank function with a smooth function of singular values, and then minimize the resulting approximation subject to the linear constraints. The accuracy of the approximation is controlled via a scaling parameter $\delta$, where a smaller $\delta$ corresponds to a more accurate fitting. The consequent optimization problem for any finite $\delta$ is nonconvex. Therefore, to decrease the risk of getting trapped in local minima, a series of optimizations is performed, starting with a rough approximation (a large $\delta$) and followed by successively optimizing finer approximations of the rank with smaller $\delta$'s. To solve the optimization problem for any $\delta > 0$, it is converted to a new program in which the cost is a function of two auxiliary positive semidefinite variables. The paper shows that this new program is concave and applies a majorize-minimize technique to solve it, which in turn leads to a few convex optimization iterations. This optimization scheme is also equivalent to a reweighted Nuclear Norm Minimization (NNM), where the weighting update depends on the approximating function used. For any $\delta > 0$, we derive a necessary and sufficient condition for exact recovery which is weaker than the corresponding condition for NNM. On the numerical side, the proposed algorithm is compared to NNM and a reweighted NNM in solving affine rank minimization and matrix completion problems, showing its considerable and consistent superiority in terms of success rate, especially when the number of measurements decreases toward the lower bound for unique representation.
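The abstract describes a graduated scheme: a smooth surrogate of the rank of $X$ built from its singular values, an outer loop that shrinks $\delta$ from coarse to fine, and an inner majorize-minimize loop that behaves like reweighted nuclear norm minimization. Below is a minimal, illustrative sketch of that idea for the matrix completion setting, not the authors' implementation: it assumes a Gaussian-type surrogate $f_\delta(\sigma) = 1 - \exp(-\sigma^2 / 2\delta^2)$ and approximates each inner weighted-NNM subproblem by a proximal-gradient step with weighted singular value soft-thresholding. The function names (`icra_matrix_completion`, `weighted_svt`), the geometric shrink factor, and all parameter values are hypothetical choices made for illustration.

```python
# Sketch only: graduated (large-to-small delta) reweighted singular-value
# thresholding for matrix completion, loosely following the abstract's
# description. Surrogate, step sizes, and stopping rules are assumptions.
import numpy as np

def weighted_svt(Y, weights):
    """Soft-threshold the singular values of Y with per-value weights."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_thr = np.maximum(s - weights, 0.0)
    return (U * s_thr) @ Vt

def icra_matrix_completion(M_obs, mask, delta0=None, shrink=0.5,
                           n_outer=8, n_inner=50, lam=1.0):
    X = np.where(mask, M_obs, 0.0)
    # Start with a coarse approximation: delta on the order of the largest singular value.
    delta = delta0 if delta0 is not None else np.linalg.svd(X, compute_uv=False)[0]
    for _ in range(n_outer):                 # outer loop: refine the rank surrogate
        for _ in range(n_inner):             # inner loop: MM / reweighted iterations
            s = np.linalg.svd(X, compute_uv=False)
            # Weight update from the Gaussian-type surrogate: large singular
            # values get small penalties, small ones get penalties close to lam.
            w = lam * np.exp(-s**2 / (2.0 * delta**2))
            # One gradient step on the data-fit term, then a weighted shrinkage step.
            grad_step = X - np.where(mask, X - M_obs, 0.0)
            X = weighted_svt(grad_step, w)
        delta *= shrink                      # smaller delta: finer rank approximation
    return X

# Usage sketch: recover a rank-2 matrix from ~60% of its entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 40))
mask = rng.random(M.shape) < 0.6
X_hat = icra_matrix_completion(np.where(mask, M, 0.0), mask)
print(np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```

The point this sketch mirrors from the abstract is the reweighting behavior: the MM-derived weights penalize small singular values much more heavily than large ones, which is what distinguishes the scheme from plain NNM, where all singular values are penalized equally.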
