
Sparsity-Dimension Trade-Offs for Oblivious Subspace Embeddings

(2212.02913)
Published Dec 6, 2022 in cs.DS, cs.CG, and cs.DM

Abstract

An oblivious subspace embedding (OSE), characterized by parameters $m,n,d,\epsilon,\delta$, is a random matrix $\Pi\in \mathbb{R}^{m\times n}$ such that for any $d$-dimensional subspace $T\subseteq \mathbb{R}^n$, $\Pr_\Pi[\forall x\in T,\ (1-\epsilon)\|x\|_2 \leq \|\Pi x\|_2 \leq (1+\epsilon)\|x\|_2] \geq 1-\delta$. When an OSE has $s\le 1/(2.001\epsilon)$ nonzero entries in each column, we show it must hold that $m = \Omega\left(d^2/(\epsilon^2 s^{1+O(\delta)})\right)$, which is the first lower bound with multiplicative factors of $d^2$ and $1/\epsilon$, improving on the previous $\Omega\left(d^2/s^{O(\delta)}\right)$ lower bound due to Li and Liu (PODS 2022). When an OSE has $s=\Omega(\log(1/\epsilon)/\epsilon)$ nonzero entries in each column, we show it must hold that $m = \Omega\left((d/\epsilon)^{1+1/(4.001\epsilon s)}/s^{O(\delta)}\right)$, which is the first lower bound with multiplicative factors of $d$ and $1/\epsilon$, improving on the previous $\Omega\left(d^{1+1/(16\epsilon s+4)}\right)$ lower bound due to Nelson and Nguyen (ICALP 2014). This second result is a special case of a more general trade-off among $d,\epsilon,s,\delta$ and $m$.
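For concreteness, the following Python sketch (not from the paper) illustrates the objects the abstract is about: an OSNAP-style sparse embedding $\Pi$ with exactly $s$ nonzero entries of value $\pm 1/\sqrt{s}$ in each column, and a measurement of its worst-case distortion over one random $d$-dimensional subspace via the singular values of $\Pi U$. The construction and all parameter values below are illustrative assumptions, not the paper's specific distributions or the bounds it proves.

```python
# Minimal sketch: a sparse embedding with s nonzeros per column, and an exact
# check of its distortion over one d-dimensional subspace. Parameters are
# illustrative only; the paper proves lower bounds on m for such embeddings.
import numpy as np

def sparse_ose(m, n, s, rng):
    """Each column gets s nonzeros equal to +-1/sqrt(s), placed in s distinct
    uniformly random rows (an OSNAP-style construction, assumed here)."""
    Pi = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=s, replace=False)
        signs = rng.choice([-1.0, 1.0], size=s)
        Pi[rows, j] = signs / np.sqrt(s)
    return Pi

def subspace_distortion(Pi, U):
    """Worst-case | ||Pi x||_2 - 1 | over unit vectors x in span(U), where U is
    an orthonormal basis; attained at the extreme singular values of Pi @ U."""
    sv = np.linalg.svd(Pi @ U, compute_uv=False)
    return max(abs(sv.max() - 1.0), abs(1.0 - sv.min()))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d, s, m = 2000, 10, 4, 600                      # illustrative sizes only
    U, _ = np.linalg.qr(rng.standard_normal((n, d)))   # basis of a random d-dim subspace
    Pi = sparse_ose(m, n, s, rng)
    print("observed distortion:", subspace_distortion(Pi, U))
```

An OSE must achieve small distortion with probability $1-\delta$ over its own randomness for every fixed subspace; the paper's results say that once the column sparsity $s$ is constrained as above, the number of rows $m$ of any such $\Pi$ cannot be much smaller than the stated bounds.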
