
Abstract

The \emph{Chow parameters} of a Boolean function $f: \{-1,1\}^n \to \{-1,1\}$ are its $n+1$ degree-0 and degree-1 Fourier coefficients. It has been known since 1961 (Chow, Tannenbaum) that the (exact values of the) Chow parameters of any linear threshold function $f$ uniquely specify $f$ within the space of all Boolean functions, but until recently (O'Donnell and Servedio) nothing was known about efficient algorithms for \emph{reconstructing} $f$ (exactly or approximately) from exact or approximate values of its Chow parameters. We refer to this reconstruction problem as the \emph{Chow Parameters Problem.} Our main result is a new algorithm for the Chow Parameters Problem which, given (sufficiently accurate approximations to) the Chow parameters of any linear threshold function $f$, runs in time $\tilde{O}(n^2) \cdot (1/\eps)^{O(\log^2(1/\eps))}$ and with high probability outputs a representation of an LTF $f'$ that is $\eps$-close to $f$. The only previous algorithm (O'Donnell and Servedio) had running time $\poly(n) \cdot 2^{2^{\tilde{O}(1/\eps^2)}}$. As a byproduct of our approach, we show that for any linear threshold function $f$ over $\{-1,1\}^n$, there is a linear threshold function $f'$ which is $\eps$-close to $f$ and has all weights that are integers at most $\sqrt{n} \cdot (1/\eps)^{O(\log^2(1/\eps))}$. This significantly improves the best previous result of Diakonikolas and Servedio, which gave a $\poly(n) \cdot 2^{\tilde{O}(1/\eps^{2/3})}$ weight bound, and is close to the known lower bound of $\max\{\sqrt{n},\ (1/\eps)^{\Omega(\log \log (1/\eps))}\}$ (Goldberg, Servedio). Our techniques also yield improved algorithms for related problems in learning theory.
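
For concreteness, the following is a minimal Python sketch (not from the paper; the function name, tie-breaking rule, and sampling scheme are our own illustration) that estimates the $n+1$ Chow parameters of a given LTF $f(x) = \mathrm{sign}(w \cdot x - \theta)$ by uniform sampling over $\{-1,1\}^n$. The Chow Parameters Problem asks for the reverse direction: recovering a representation of an $\eps$-close LTF from such (approximate) estimates.

    import numpy as np

    def estimate_chow_parameters(w, theta, num_samples=100_000, rng=None):
        """Estimate the n+1 Chow parameters of the LTF
        f(x) = sign(<w, x> - theta) over {-1,1}^n by uniform sampling.

        Returns (chow_0, chow_1), where chow_0 approximates E[f(x)]
        (the degree-0 Fourier coefficient) and chow_1[i] approximates
        E[f(x) * x_i] (the degree-1 Fourier coefficients).
        """
        rng = np.random.default_rng() if rng is None else rng
        n = len(w)
        # Draw uniform samples from {-1,1}^n.
        x = rng.choice([-1, 1], size=(num_samples, n))
        f = np.sign(x @ w - theta)
        f[f == 0] = 1  # break ties toward +1 (an arbitrary convention)
        chow_0 = f.mean()
        chow_1 = (f[:, None] * x).mean(axis=0)
        return chow_0, chow_1

    # Example: majority on 5 variables; by symmetry its degree-1
    # Chow parameters are all equal and its degree-0 parameter is 0.
    chow_0, chow_1 = estimate_chow_parameters(np.ones(5), 0.0)
    print(chow_0, chow_1)

By a standard Chernoff bound, $O(\log(n)/\tau^2)$ samples suffice to estimate each Chow parameter to additive accuracy $\tau$ with high probability, which is why "sufficiently accurate approximations" in the statement above can be obtained efficiently from random examples.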
