Abstract

Low-distortion embeddings are critical building blocks for developing random sampling and random projection algorithms for linear algebra problems. We show that, given a matrix $A \in \mathbb{R}^{n \times d}$ with $n \gg d$ and a $p \in [1, 2)$, with constant probability we can construct a low-distortion embedding matrix $\Pi \in \mathbb{R}^{O(\mathrm{poly}(d)) \times n}$ that embeds $\mathcal{A}_p$, the $\ell_p$ subspace spanned by $A$'s columns, into $(\mathbb{R}^{O(\mathrm{poly}(d))}, \|\cdot\|_p)$; the distortion of our embeddings is only $O(\mathrm{poly}(d))$, and we can compute $\Pi A$ in $O(\mathrm{nnz}(A))$ time, i.e., input-sparsity time. Our result generalizes the input-sparsity time $\ell_2$ subspace embedding by Clarkson and Woodruff [STOC'13]; and for completeness, we present a simpler and improved analysis of their construction for $\ell_2$. These input-sparsity time $\ell_p$ embeddings are optimal, up to constants, in terms of their running time; and the improved running time propagates to applications such as $(1 \pm \epsilon)$-distortion $\ell_p$ subspace embedding and relative-error $\ell_p$ regression. For $\ell_2$, we show that a $(1+\epsilon)$-approximate solution to the $\ell_2$ regression problem specified by the matrix $A$ and a vector $b \in \mathbb{R}^n$ can be computed in $O(\mathrm{nnz}(A) + d^3 \log(d/\epsilon)/\epsilon^2)$ time; and for $\ell_p$, via a subspace-preserving sampling procedure, we show that a $(1 \pm \epsilon)$-distortion embedding of $\mathcal{A}_p$ into $\mathbb{R}^{O(\mathrm{poly}(d))}$ can be computed in $O(\mathrm{nnz}(A) \cdot \log n)$ time, and we also show that a $(1+\epsilon)$-approximate solution to the $\ell_p$ regression problem $\min_{x \in \mathbb{R}^d} \|Ax - b\|_p$ can be computed in $O(\mathrm{nnz}(A) \cdot \log n + \mathrm{poly}(d) \log(1/\epsilon)/\epsilon^2)$ time. Moreover, we can improve the embedding dimension, or equivalently the sample size, to $O(d^{3+p/2} \log(1/\epsilon)/\epsilon^2)$ without increasing the complexity.
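
To make the $\ell_2$ case concrete, below is a minimal sketch-and-solve example in the spirit of the Clarkson-Woodruff input-sparsity time embedding referenced above: a CountSketch-style matrix $\Pi$ with a single random $\pm 1$ entry per column, so $\Pi A$ can be applied with $O(\mathrm{nnz}(A))$ work. This is an illustrative sketch, not the paper's construction; the function name `countsketch_embed`, the sketch size `r = 4*d*d`, and the problem sizes are all assumptions chosen for the example.

```python
# Illustrative sketch (not the authors' exact construction) of a CountSketch-style
# sparse l2 subspace embedding: each column of Pi has one +/-1 entry in a uniformly
# random row, so Pi @ A touches each nonzero of A exactly once.
import numpy as np
import scipy.sparse as sp

def countsketch_embed(A, b, r, rng=None):
    """Return (Pi A, Pi b) for a CountSketch-style embedding Pi in R^{r x n}."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    rows = rng.integers(0, r, size=n)        # h(i): sketch row receiving input row i
    signs = rng.choice([-1.0, 1.0], size=n)  # s(i): random sign applied to input row i
    Pi = sp.csr_matrix((signs, (rows, np.arange(n))), shape=(r, n))
    return Pi @ A, Pi @ b                    # sparse products: O(nnz(A) + nnz(b)) work

# Sketch-and-solve for min_x ||Ax - b||_2 on a tall sparse problem
# (sizes and sketch dimension are assumptions for illustration only).
rng = np.random.default_rng(0)
n, d = 100_000, 20
A = sp.random(n, d, density=0.01, format="csr", random_state=0)
b = rng.standard_normal(n)
SA, Sb = countsketch_embed(A, b, r=4 * d * d, rng=rng)
x_tilde, *_ = np.linalg.lstsq(SA.toarray(), Sb, rcond=None)
```

For $p \in [1, 2)$ the paper's low-distortion embeddings rely on a different (still sparse) construction; the snippet above only illustrates the $\ell_2$ sketch-and-solve pattern that the $\ell_p$ results generalize.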
