Rigorous Restricted Isometry Property of Low-Dimensional Subspaces

(1801.10058)
Published Jan 30, 2018 in cs.IT, cs.LG, and math.IT

Abstract

Dimensionality reduction is in demand to reduce the complexity of solving large-scale problems with data lying on latent low-dimensional structures in machine learning and computer vision. Motivated by such need, in this work we study the Restricted Isometry Property (RIP) of Gaussian random projections for low-dimensional subspaces in $\mathbb{R}^N$, and rigorously prove that the projection Frobenius norm distance between any two subspaces spanned by the projected data in $\mathbb{R}^n$ ($n<N$) remains almost the same as the distance between the original subspaces, with probability no less than $1 - {\rm e}^{-\mathcal{O}(n)}$. Previously, the well-known Johnson-Lindenstrauss (JL) Lemma and the RIP for sparse vectors have been the foundation of sparse signal processing, including Compressed Sensing. As an analogue of the JL Lemma and the RIP for sparse vectors, this work allows the use of random projections to reduce the ambient dimension, with the theoretical guarantee that the distance between subspaces after compression is well preserved.
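The following is a minimal numerical sketch (not the paper's method or proof) of the property the abstract describes: a Gaussian random projection from $\mathbb{R}^N$ to $\mathbb{R}^n$ approximately preserves the projection Frobenius norm distance between two low-dimensional subspaces. The distance is computed here with the common definition $D(\mathcal{X},\mathcal{Y}) = \|P_{\mathcal{X}} - P_{\mathcal{Y}}\|_F / \sqrt{2}$ via orthogonal projectors; the exact normalization, the $1/\sqrt{n}$ scaling of the projection, and the dimensions chosen below are illustrative assumptions rather than values taken from the paper.

```python
import numpy as np

def proj_frobenius_distance(A, B):
    """Projection Frobenius-norm distance between the column spans of A and B.

    Uses the common definition D(X, Y) = ||P_X - P_Y||_F / sqrt(2), where
    P_X, P_Y are orthogonal projectors onto the subspaces. The sqrt(2)
    normalization is an assumption; the paper may scale differently.
    """
    Qa, _ = np.linalg.qr(A)          # orthonormal basis of span(A)
    Qb, _ = np.linalg.qr(B)          # orthonormal basis of span(B)
    Pa = Qa @ Qa.T                   # projector onto span(A)
    Pb = Qb @ Qb.T                   # projector onto span(B)
    return np.linalg.norm(Pa - Pb, "fro") / np.sqrt(2)

rng = np.random.default_rng(0)
N, n, d = 2000, 200, 5               # ambient dim, compressed dim, subspace dim (illustrative)

# Two d-dimensional subspaces of R^N, each given by a random basis.
X = rng.standard_normal((N, d))
Y = rng.standard_normal((N, d))

# Gaussian random projection R^N -> R^n; 1/sqrt(n) scaling is conventional.
Phi = rng.standard_normal((n, N)) / np.sqrt(n)

d_before = proj_frobenius_distance(X, Y)
d_after = proj_frobenius_distance(Phi @ X, Phi @ Y)
print(f"distance before projection: {d_before:.4f}")
print(f"distance after  projection: {d_after:.4f}")
```

Running this typically shows the two distances agreeing to within a few percent, which is the behavior the theorem quantifies with high probability; the sketch only illustrates the phenomenon and does not reproduce the paper's constants or failure-probability bound.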
