
Sparsity Lower Bounds for Dimensionality Reducing Maps

(1211.0995)
Published Nov 5, 2012 in cs.DS, cs.IT, and math.IT

Abstract

We give near-tight lower bounds for the sparsity required in several dimensionality reducing linear maps. First, consider the JL lemma, which states that for any set of n vectors in R^d there is a matrix A in R^{m x d} with m = O(eps^{-2} log n) such that mapping by A preserves the pairwise Euclidean distances of these n vectors up to a 1 +/- eps factor. We show that there exists a set of n vectors such that any such matrix A with at most s non-zero entries per column must have s = Omega(eps^{-1} log n / log(1/eps)) as long as m < O(n / log(1/eps)). This bound improves the lower bound of Omega(min{eps^{-2}, eps^{-1} sqrt(log_m d)}) of [Dasgupta-Kumar-Sarlos, STOC 2010], which held only against the stronger property of distributional JL, and only against a certain restricted class of distributions; our lower bound is against the JL lemma itself, with no restrictions. It matches the sparse Johnson-Lindenstrauss upper bound of [Kane-Nelson, SODA 2012] up to an O(log(1/eps)) factor.

Next, we show that any m x n matrix with the k-restricted isometry property (RIP) with constant distortion must have at least Omega(k log(n/k)) non-zeroes per column if the number of rows is the optimal value m = O(k log(n/k)) and if k < n/polylog(n). This improves the previous lower bound of Omega(min{k, n/m}) by [Chandar, 2010] and shows that for virtually all k it is impossible to have a sparse RIP matrix with an optimal number of rows.

Lastly, we show that any oblivious distribution over subspace embedding matrices with 1 non-zero per column, preserving all distances in a d-dimensional subspace up to a constant factor with constant probability, must have at least Omega(d^2) rows. This matches one of the upper bounds in [Nelson-Nguyen, 2012] and shows the impossibility of obtaining the best of both constructions in that work, namely 1 non-zero per column and ~O(d) rows.
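To make the object of the first result concrete, below is a minimal sketch (not the paper's construction or proof) of a sparse JL map in the style of [Kane-Nelson, SODA 2012]: each column carries exactly s nonzero entries of value +/- 1/sqrt(s) placed in uniformly random rows, so s is precisely the per-column sparsity that the lower bound constrains. The constants and the use of natural logarithms in the parameter choices are illustrative assumptions, not values from the paper.

```python
# A minimal sketch of a sparse JL map: each column of A has exactly
# s nonzero entries, each +/- 1/sqrt(s), in uniformly random rows.
# Parameter constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sparse_jl_matrix(m, d, s):
    """m x d map with s nonzeros per column, entries +/- 1/sqrt(s)."""
    A = np.zeros((m, d))
    for j in range(d):
        rows = rng.choice(m, size=s, replace=False)  # s distinct random rows
        A[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)
    return A

n, d, eps = 64, 2048, 0.25
m = int(4 / eps**2 * np.log(n))       # m = O(eps^{-2} log n) rows
s = max(1, int(2 / eps * np.log(n)))  # s = O(eps^{-1} log n) nonzeros per column
A = sparse_jl_matrix(m, d, s)

# Empirically check the distortion of one pairwise distance.
X = rng.standard_normal((n, d))
orig = np.linalg.norm(X[0] - X[1])
mapped = np.linalg.norm(A @ (X[0] - X[1]))
print(f"distortion: {mapped / orig:.3f}  (want within 1 +/- {eps})")
```

The lower bound says this trade-off is essentially forced: with the optimal number of rows, no JL matrix can be much sparser per column than the s used here, up to the O(log(1/eps)) gap noted above.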

