The Johnson-Lindenstrauss Lemma for Clustering and Subspace Approximation: From Coresets to Dimension Reduction (2205.00371v3)
Abstract: We study the effect of Johnson-Lindenstrauss transforms in various projective clustering problems, generalizing recent results which only applied to center-based clustering [MMR19]. We ask the general question: for a Euclidean optimization problem and an accuracy parameter $\epsilon \in (0, 1)$, what is the smallest target dimension $t \in \mathbb{N}$ such that a Johnson-Lindenstrauss transform $\Pi \colon \mathbb{R}^d \to \mathbb{R}^t$ preserves the cost of the optimal solution up to a $(1+\epsilon)$-factor? We give a new technique which uses coreset constructions to analyze the effect of the Johnson-Lindenstrauss transform. Our technique, in addition to applying to center-based clustering, improves on (or is the first to address) other Euclidean optimization problems, including:

$\bullet$ For $(k,z)$-subspace approximation: we show that $t = \tilde{O}(zk^2 / \epsilon^3)$ suffices, whereas the prior best bound, of $O(k/\epsilon^2)$, only applied to the case $z = 2$ [CEMMP15].

$\bullet$ For $(k,z)$-flat approximation: we show that $t = \tilde{O}(zk^2/\epsilon^3)$ suffices, completely removing the dependence on $n$ from the prior bound $\tilde{O}(zk^2 \log n/\epsilon^3)$ of [KR15].

$\bullet$ For $(k,z)$-line approximation: we show that $t = O((k \log \log n + z + \log(1/\epsilon)) / \epsilon^3)$ suffices, and ours is the first dimension reduction result for this problem.
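To make the question concrete, below is a minimal Python sketch (not the paper's construction or analysis) of the experiment the abstract describes for $z = 2$: project a point set through a Gaussian Johnson-Lindenstrauss map $\Pi \colon \mathbb{R}^d \to \mathbb{R}^t$ and compare a fixed clustering solution's cost before and after projection. The helper names `jl_transform` and `kmeans_cost` are hypothetical, and the choice of $t$ only loosely instantiates the paper's $\tilde{O}(zk^2/\epsilon^3)$ bound with an illustrative constant.

```python
# Sketch: does a Gaussian JL projection preserve (k, z)-clustering cost for
# z = 2 (k-means)? Target dimension t loosely follows the paper's
# t = O~(z k^2 / eps^3) bound; the constant (1) is illustrative, not proven.
import numpy as np

def jl_transform(X, t, rng):
    """Project rows of X from R^d to R^t via a scaled Gaussian matrix."""
    d = X.shape[1]
    Pi = rng.standard_normal((d, t)) / np.sqrt(t)
    return X @ Pi

def kmeans_cost(X, centers):
    """Sum over points of squared distance to the nearest center (z = 2)."""
    dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return dists.min(axis=1).sum()

rng = np.random.default_rng(0)
n, d, k, z, eps = 500, 200, 3, 2, 0.9

# Synthetic data: k well-separated Gaussian clusters in R^d.
means = rng.standard_normal((k, d)) * 10
labels = rng.integers(0, k, size=n)
X = means[labels] + rng.standard_normal((n, d))

t = int(np.ceil(z * k**2 / eps**3))  # hypothetical instantiation of the bound
Y = jl_transform(X, t, rng)

# Evaluate the planted solution's cost in both the original and projected space.
centers_hi = np.stack([X[labels == j].mean(axis=0) for j in range(k)])
centers_lo = np.stack([Y[labels == j].mean(axis=0) for j in range(k)])
ratio = kmeans_cost(Y, centers_lo) / kmeans_cost(X, centers_hi)
print(f"t = {t}, cost ratio (projected / original) = {ratio:.3f}")
```

With data of this kind the printed ratio should land within roughly $1 \pm \epsilon$; the paper's contribution is proving such guarantees for the optimal solution across the projective clustering problems listed above, not just for one fixed solution.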