Emergent Mind

Fourier sparsity, spectral norm, and the Log-rank conjecture

(1304.1245)
Published Apr 4, 2013 in cs.CC

Abstract

We study Boolean functions with sparse Fourier coefficients or small spectral norm, and show their applications to the Log-rank Conjecture for XOR functions f(x \oplus y), a fairly large class of functions including well-studied ones such as Equality and Hamming Distance. The rank of the communication matrix M_f for such functions is exactly the Fourier sparsity of f. Let d be the F_2-degree of f, and let D^{CC}(f) stand for the deterministic communication complexity of f(x \oplus y). We show that:

1. D^{CC}(f) = O(2^{d^2/2} \log^{d-2} ||\hat f||_1). In particular, the Log-rank Conjecture holds for XOR functions of constant F_2-degree.

2. D^{CC}(f) = O(d ||\hat f||_1) = O(\sqrt{rank(M_f)} \log rank(M_f)).

We obtain our results through a degree-reduction protocol based on a variant of polynomial rank, and we conjecture that its communication cost is in fact \log^{O(1)} rank(M_f). The above bounds also hold for the parity decision tree complexity of f, a measure that is no less than the communication complexity (up to a factor of 2).

Along the way we also show several structural results about Boolean functions with small F_2-degree or small spectral norm, which may be of independent interest. For functions f of constant F_2-degree: 1) f can be written as the sum of quasi-polynomially many indicator functions of subspaces with \pm-signs, improving the previous doubly exponential upper bound of Green and Sanders; 2) being sparse in the Fourier domain is polynomially equivalent to having small parity decision tree complexity; 3) f depends on only polylog ||\hat f||_1 linear functions of the input variables. For functions f with small spectral norm: 1) there is an affine subspace of co-dimension O(||\hat f||_1) on which f is constant; 2) there is a parity decision tree of depth O(||\hat f||_1 \log ||\hat f||_0).
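The identity underlying the abstract's first claim is that for F(x, y) = f(x \oplus y), the communication matrix factors as M_F = \sum_S \hat f(S) \chi_S(x) \chi_S(y), so rank(M_F) equals the Fourier sparsity ||\hat f||_0. The following sketch checks this numerically for Equality on 3-bit inputs (written in \pm 1 form, f(z) = -1 iff z = 0); the function names are our own illustrative choices, not from the paper.

```python
import numpy as np

def fourier_coefficients(f, n):
    """Fourier coefficients of g(z) = (-1)^f(z) over the characters
    chi_S(z) = (-1)^{<S, z>}, with S and z encoded as n-bit integers."""
    N = 2 ** n
    coeffs = {}
    for S in range(N):
        total = 0.0
        for z in range(N):
            chi = (-1) ** bin(S & z).count("1")  # character chi_S(z)
            total += chi * (-1) ** f(z)          # f viewed as +/-1-valued
        coeffs[S] = total / N
    return coeffs

# Equality as an XOR function: f(z) = 1 iff z = 0, so
# F(x, y) = f(x XOR y) tests whether x == y.
n = 3
f = lambda z: 1 if z == 0 else 0

coeffs = fourier_coefficients(f, n)
sparsity = sum(1 for c in coeffs.values() if abs(c) > 1e-9)  # ||f hat||_0
spectral_norm = sum(abs(c) for c in coeffs.values())         # ||f hat||_1

# Communication matrix in +/-1 form: M[x][y] = (-1)^{f(x XOR y)}
M = np.array([[(-1) ** f(x ^ y) for y in range(2 ** n)]
              for x in range(2 ** n)])

print(sparsity, np.linalg.matrix_rank(M))  # sparsity equals the rank
```

For Equality all 2^n coefficients are nonzero, so both the sparsity and the rank come out to 8 here, while the spectral norm stays small (2.5), which is what makes Equality a natural example for the ||\hat f||_1-based bounds above.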
