Emergent Mind

Complete Dictionary Recovery over the Sphere II: Recovery by Riemannian Trust-region Method

(arXiv:1511.04777)
Published Nov 15, 2015 in cs.IT, cs.CV, math.IT, math.OC, and stat.ML

Abstract

We consider the problem of recovering a complete (i.e., square and invertible) matrix $\mathbf A_0$ from $\mathbf Y \in \mathbb{R}^{n \times p}$ with $\mathbf Y = \mathbf A_0 \mathbf X_0$, provided $\mathbf X_0$ is sufficiently sparse. This recovery problem is central to the theoretical understanding of dictionary learning, which seeks a sparse representation for a collection of input signals and finds numerous applications in modern signal processing and machine learning. We give the first efficient algorithm that provably recovers $\mathbf A_0$ when $\mathbf X_0$ has $O(n)$ nonzeros per column, under a suitable probability model for $\mathbf X_0$. Our algorithmic pipeline centers around solving a certain nonconvex optimization problem with a spherical constraint, and hence is naturally phrased in the language of manifold optimization. In a companion paper (arXiv:1511.03607), we showed that with high probability our nonconvex formulation has no "spurious" local minimizers and that around any saddle point the objective function has a negative directional curvature. In this paper, we take advantage of this particular geometric structure and describe a Riemannian trust-region algorithm that provably converges to a local minimizer from arbitrary initializations. Such minimizers give excellent approximations to the rows of $\mathbf X_0$. The rows are then recovered by linear programming rounding and deflation.
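
The abstract only outlines the pipeline; the sketch below is not the authors' implementation. It is a minimal NumPy/SciPy illustration that assumes the smoothed sparsity surrogate $f(\mathbf q) = \frac{1}{p}\sum_k \mu \log\cosh(\mathbf q^\top \mathbf y_k/\mu)$ over the sphere as the nonconvex objective, shows only the first-order Riemannian ingredients (tangent-space projection of the gradient and retraction by normalization; the paper's algorithm is a second-order trust-region method, which is not reproduced here), and uses an LP rounding step of the assumed form $\min_{\mathbf q} \|\mathbf q^\top \mathbf Y\|_1$ subject to $\langle \bar{\mathbf q}, \mathbf q\rangle = 1$. All function names, parameters, and step sizes are illustrative.

```python
# Hypothetical sketch (not the paper's code): log-cosh surrogate on the sphere,
# Riemannian gradient + retraction, and an LP rounding step for one row of X0.
import numpy as np
from scipy.optimize import linprog


def surrogate_objective(q, Y, mu=0.01):
    """Assumed smooth objective f(q) = (1/p) * sum_k mu * log cosh(q^T y_k / mu)."""
    z = q @ Y                                       # inner products q^T y_k, shape (p,)
    # log cosh(x) = logaddexp(x, -x) - log 2, computed stably
    return np.mean(mu * (np.logaddexp(z / mu, -z / mu) - np.log(2.0)))


def riemannian_grad(q, Y, mu=0.01):
    """Euclidean gradient of f projected onto the tangent space of the sphere at q."""
    z = q @ Y
    egrad = Y @ np.tanh(z / mu) / Y.shape[1]
    return egrad - (q @ egrad) * q                  # (I - q q^T) egrad


def retract(q, v):
    """Retraction onto the sphere: move along v in the ambient space, renormalize."""
    w = q + v
    return w / np.linalg.norm(w)


def lp_round(q_bar, Y):
    """Assumed LP rounding: min_q ||q^T Y||_1  s.t.  <q_bar, q> = 1,
    linearized with slack variables t >= |Y^T q| for scipy's linprog."""
    n, p = Y.shape
    c = np.concatenate([np.zeros(n), np.ones(p)])   # variables x = [q; t], minimize sum(t)
    A_ub = np.block([[Y.T, -np.eye(p)],             #  Y^T q - t <= 0
                     [-Y.T, -np.eye(p)]])           # -Y^T q - t <= 0
    b_ub = np.zeros(2 * p)
    A_eq = np.concatenate([q_bar, np.zeros(p)])[None, :]
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] * (n + p))
    q = res.x[:n]
    return q / np.linalg.norm(q)


if __name__ == "__main__":
    # Tiny synthetic check with A0 = I and a Bernoulli-Gaussian X0 (illustrative only).
    rng = np.random.default_rng(0)
    n, p = 20, 2000
    X0 = rng.standard_normal((n, p)) * (rng.random((n, p)) < 0.2)
    Y = X0                                          # A0 = identity, so rows of X0 are the targets
    q = retract(rng.standard_normal(n), np.zeros(n))
    for _ in range(200):                            # plain Riemannian gradient descent, fixed step
        q = retract(q, -0.5 * riemannian_grad(q, Y))
    q_hat = lp_round(q, Y)                          # should align with a standard basis vector
    print("largest |entry| of q_hat:", np.max(np.abs(q_hat)))
```

In the sketch, plain Riemannian gradient descent with a fixed step stands in for the paper's trust-region iterations, and the deflation step (removing a recovered row and repeating) is omitted; both substitutions are purely for brevity.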
