Biased halfspaces, noise sensitivity, and local Chernoff inequalities

(1710.07429)
Published Oct 20, 2017 in math.CO, cs.CC, cs.DM, and math.PR

Abstract

A halfspace is a function $f\colon\{-1,1\}^n \rightarrow \{0,1\}$ of the form $f(x)=\mathbb{1}(a\cdot x>t)$, where $\sum_i a_i^2=1$. We show that if $f$ is a halfspace with $\mathbb{E}[f]=\epsilon$ and $a'=\max_i |a_i|$, then the degree-1 Fourier weight of $f$ is $W^1(f)=\Theta(\epsilon^2 \log(1/\epsilon))$, and the maximal influence of $f$ is $I_{\max}(f)=\Theta(\epsilon \min(1,a' \sqrt{\log(1/\epsilon)}))$. These results, which determine the exact asymptotic order of $W^1(f)$ and $I_{\max}(f)$, provide sharp generalizations of theorems proved by Matulef, O'Donnell, Rubinfeld, and Servedio, and settle a conjecture posed by Kalai, Keller and Mossel. In addition, we present a refinement of the definition of noise sensitivity which takes into consideration the bias of the function, and show that (like in the unbiased case) halfspaces are noise resistant, and, in the other direction, any noise resistant function is well correlated with a halfspace. Our main tools are 'local' forms of the classical Chernoff inequality, like the following one proved by Devroye and Lugosi (2008): Let $\{x_i\}$ be independent random variables uniformly distributed in $\{-1,1\}$, and let $a_i\in\mathbb{R}^+$ be such that $\sum_i a_i^2=1$. If for some $t\geq 0$ we have $\Pr[\sum_i a_i x_i > t]=\epsilon$, then $\Pr[\sum_i a_i x_i>t+\delta]\leq \frac{\epsilon}{2}$ holds for $\delta\leq c/\sqrt{\log(1/\epsilon)}$, where $c$ is a universal constant.
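The Devroye–Lugosi local Chernoff inequality quoted above can be checked numerically on a concrete instance. The sketch below, a minimal illustration rather than anything from the paper, takes the equal-weight case $a_i = 1/\sqrt{n}$ (so the tail probabilities reduce to exact binomial sums) and verifies that shifting the threshold by $\delta = c/\sqrt{\log(1/\epsilon)}$ at least halves the tail. The constant `c = 0.9` is an illustrative choice for this instance, not the universal constant whose existence the theorem asserts.

```python
import math

def tail_prob(n, t):
    """Exact Pr[sum_i a_i x_i > t] for a_i = 1/sqrt(n) and i.i.d. uniform
    x_i in {-1,1}. With k of the x_i equal to +1, the sum is (2k-n)/sqrt(n),
    so the tail is an exact binomial sum."""
    total = sum(math.comb(n, k) for k in range(n + 1)
                if (2 * k - n) / math.sqrt(n) > t)
    return total / 2 ** n

n, t = 100, 1.0
eps = tail_prob(n, t)                      # Pr[sum > t] = epsilon

c = 0.9                                    # illustrative constant for this instance
delta = c / math.sqrt(math.log(1 / eps))
eps_shifted = tail_prob(n, t + delta)      # Pr[sum > t + delta]

# The local Chernoff inequality predicts eps_shifted <= eps / 2.
print(eps, eps_shifted, eps / 2)
```

For $n = 100$ and $t = 1$ the sum is close to a standard Gaussian, $\epsilon \approx 0.136$, and the shifted tail indeed drops below $\epsilon/2$, matching the qualitative content of the inequality: near a tail of mass $\epsilon$, moving the threshold by only $O(1/\sqrt{\log(1/\epsilon)})$ halves the tail.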
