Code Sparsification and its Applications (2311.00788v1)
Abstract: We introduce a notion of code sparsification that generalizes the notion of cut sparsification in graphs. For a (linear) code $\mathcal{C} \subseteq \mathbb{F}_q^n$ of dimension $k$, a $(1 \pm \epsilon)$-sparsification of size $s$ is given by a weighted set $S \subseteq [n]$ with $|S| \leq s$ such that for every codeword $c \in \mathcal{C}$, the projection $c|_S$ of $c$ to the set $S$ has (weighted) Hamming weight that is a $(1 \pm \epsilon)$-approximation of the Hamming weight of $c$. We show that for every code there exists a $(1 \pm \epsilon)$-sparsification of size $s = \widetilde{O}(k \log (q) / \epsilon^2)$. This immediately implies known results on graph and hypergraph cut sparsification up to polylogarithmic factors (with a simple unified proof). One application of our result is near-linear size sparsifiers for constraint satisfaction problems (CSPs) over $\mathbb{F}_p$-valued variables whose unsatisfying assignments can be expressed as the zeros of a linear equation modulo a prime $p$. Building on this, we obtain a complete characterization of ternary Boolean CSPs that admit near-linear size sparsification. Finally, via the connection between the eigenvalues of the Laplacians of Cayley graphs over $\mathbb{F}_2^k$ and the weights of codewords, we also give the first proof of the existence of spectral Cayley graph sparsifiers over $\mathbb{F}_2^k$ by Cayley graphs, i.e., where we sparsify the set of generators to nearly-optimal size.
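The following is a minimal sketch, in Python, of what the sparsification guarantee in the abstract asks for: given a generator matrix for a code and a weighted coordinate set $S$, it checks by brute force that the weighted Hamming weight of every codeword restricted to $S$ is within a $(1 \pm \epsilon)$ factor of its true Hamming weight. The generator matrix, the choice of $S$, and its weights in the usage example are hypothetical illustrations only; this verifies the definition and is not the paper's construction of a sparsifier.

```python
# Sketch of the (1 +/- eps)-code-sparsifier definition from the abstract.
# The concrete code, the set S, and its weights below are hypothetical.

import itertools
import numpy as np

def codewords(G, q=2):
    """Enumerate all codewords of the linear code generated by the rows of G over F_q."""
    k, n = G.shape
    for msg in itertools.product(range(q), repeat=k):
        yield (np.array(msg) @ G) % q

def is_sparsifier(G, S, eps, q=2):
    """Check that the weighted set S (dict: coordinate -> weight) is a
    (1 +/- eps)-sparsifier of the code generated by G: for every codeword c,
    the weighted Hamming weight of c restricted to S lies within a (1 +/- eps)
    factor of the (unweighted) Hamming weight of c."""
    for c in codewords(G, q):
        true_wt = np.count_nonzero(c)
        approx_wt = sum(w for i, w in S.items() if c[i] != 0)
        if not ((1 - eps) * true_wt <= approx_wt <= (1 + eps) * true_wt):
            return False
    return True

# Hypothetical toy example over F_2: the repetition code of length 6.
# Keeping a single coordinate with weight 6 preserves all codeword weights exactly.
n, eps = 6, 0.1
G = np.ones((1, n), dtype=int)   # single generator: the all-ones codeword
S = {0: float(n)}                # keep coordinate 0 with weight n
print(is_sparsifier(G, S, eps))  # True
```

The paper's main theorem says that, for any linear code of dimension $k$ over $\mathbb{F}_q$, such a weighted set $S$ of size $\widetilde{O}(k \log(q) / \epsilon^2)$ always exists; the brute-force check above is only feasible for tiny codes, since it enumerates all $q^k$ codewords.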