
The local convexity of solving systems of quadratic equations

(1506.07868)
Published Jun 25, 2015 in math.NA, math.OC, and stat.ML

Abstract

This paper considers the recovery of a rank $r$ positive semidefinite matrix $X X^T \in \mathbb{R}^{n \times n}$ from $m$ scalar measurements of the form $y_i := a_i^T X X^T a_i$ (i.e., quadratic measurements of $X$). Such problems arise in a variety of applications, including covariance sketching of high-dimensional data streams, quadratic regression, and quantum state tomography, among others. A natural approach to this problem is to minimize the loss function $f(U) = \sum_i (y_i - a_i^T U U^T a_i)^2$, which has an entire manifold of solutions given by $\{XO\}_{O \in \mathcal{O}_r}$, where $\mathcal{O}_r$ is the orthogonal group of $r \times r$ orthogonal matrices; this is {\it non-convex} in the $n \times r$ matrix $U$, but methods like gradient descent are simple and easy to implement (as compared to semidefinite relaxation approaches). In this paper we show that once we have $m \geq C n r \log^2(n)$ samples from isotropic Gaussian $a_i$, with high probability {\em (a)} this function admits a dimension-independent region of {\em local strong convexity} on lines perpendicular to the solution manifold, and {\em (b)} with an additional polynomial factor of $r$ samples, a simple spectral initialization will land within the region of convexity with high probability. Together, this implies that gradient descent with initialization (but no re-sampling) will converge linearly to the correct $X$, up to an orthogonal transformation. We believe that this general technique (local convexity reachable by spectral initialization) should prove applicable to a broader class of nonconvex optimization problems.
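To make the pipeline concrete, here is a minimal numerical sketch (not the authors' code) of the approach the abstract describes: form quadratic measurements with isotropic Gaussian $a_i$, initialize with the top-$r$ eigenvectors of a shifted sample moment matrix, then run plain gradient descent on $f(U)$. The step size, the shift estimate, and the iteration count below are heuristic assumptions, not values taken from the paper.

```python
import numpy as np

# Sketch of the scheme in the abstract: recover X (n x r) from quadratic
# measurements y_i = a_i^T X X^T a_i via spectral initialization followed
# by gradient descent on f(U) = sum_i (y_i - a_i^T U U^T a_i)^2.

rng = np.random.default_rng(0)
n, r = 50, 2
m = 10 * n * r                   # sample size; the theory needs m >= C n r log^2(n)

X = rng.standard_normal((n, r))  # ground-truth factor
A = rng.standard_normal((m, n))  # rows are isotropic Gaussian measurement vectors a_i
y = np.sum((A @ X) ** 2, axis=1)  # y_i = ||X^T a_i||^2 = a_i^T X X^T a_i

# Spectral initialization: for M = (1/m) sum_i y_i a_i a_i^T we have
# E[M] = 2 X X^T + tr(X X^T) I, so the top-r eigenspace of M aligns with
# the column space of X.  The shift below estimates tr(X X^T) via E[y_i].
M = (A.T * y) @ A / m
vals, vecs = np.linalg.eigh(M)          # eigenvalues in ascending order
shift = np.mean(y)                      # estimate of tr(X X^T) (assumption)
top = np.maximum((vals[-r:] - shift) / 2.0, 0.0)
U = vecs[:, -r:] * np.sqrt(top)         # scale eigenvectors to match X's spectrum

# Gradient descent on f(U)/m; the gradient of f is
# 4 sum_i (a_i^T U U^T a_i - y_i) a_i a_i^T U.
step = 0.05 / np.mean(y)                # heuristic step size (assumption)
for _ in range(500):                    # iteration count is also a heuristic
    resid = np.sum((A @ U) ** 2, axis=1) - y
    grad = (4.0 / m) * (A.T * resid) @ (A @ U)
    U -= step * grad

# Recovery holds only up to an orthogonal transformation O in O_r,
# so compare the products U U^T and X X^T rather than U and X directly.
err = np.linalg.norm(U @ U.T - X @ X.T) / np.linalg.norm(X @ X.T)
print(f"relative error in X X^T: {err:.2e}")
```

Because any $U = XO$ with $O \in \mathcal{O}_r$ is a global minimizer, the error is measured on $U U^T$ versus $X X^T$; comparing $U$ to $X$ entrywise would report failure even on an exact recovery.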
