
Lower Bounds for Sparse Oblivious Subspace Embeddings

(2112.10987)
Published Dec 21, 2021 in cs.DS, cs.CG, and cs.DM

Abstract

An oblivious subspace embedding (OSE), characterized by parameters $m,n,d,\epsilon,\delta$, is a random matrix $\Pi\in \mathbb{R}^{m\times n}$ such that for any $d$-dimensional subspace $T\subseteq \mathbb{R}^n$, $\Pr_\Pi[\forall x\in T,\ (1-\epsilon)\|x\|_2 \leq \|\Pi x\|_2 \leq (1+\epsilon)\|x\|_2] \geq 1-\delta$. For $\epsilon$ and $\delta$ at most a small constant, we show that any OSE with one nonzero entry in each column must satisfy $m = \Omega(d^2/(\epsilon^2\delta))$, establishing the optimality of the classical Count-Sketch matrix. When an OSE has $1/(9\epsilon)$ nonzero entries in each column, we show it must hold that $m = \Omega(\epsilon^{O(\delta)} d^2)$, improving on the previous $\Omega(\epsilon^2 d^2)$ lower bound due to Nelson and Nguyen (ICALP 2014).

