Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning

(1602.05161)
Published Feb 16, 2016 in cs.LG, cs.CC, and cs.CR

Abstract

We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt, Valiant and Wager and shows that for some learning problems a large storage space is crucial. More formally, in the problem of parity learning, an unknown string $x \in \{0,1\}^n$ is chosen uniformly at random. A learner tries to learn $x$ from a stream of samples $(a_1, b_1), (a_2, b_2), \ldots$, where each $a_t$ is uniformly distributed over $\{0,1\}^n$ and $b_t$ is the inner product of $a_t$ and $x$, modulo 2. We show that any algorithm for parity learning that uses less than $\frac{n^2}{25}$ bits of memory requires an exponential number of samples. Previously, there was no non-trivial lower bound on the number of samples needed for any learning problem, even if the allowed memory size is $O(n)$ (where $n$ is the space needed to store one sample). We also give an application of our result in the field of bounded-storage cryptography. We show an encryption scheme that requires a private key of length $n$, as well as time complexity of $n$ per encryption/decryption of each bit, and is provably and unconditionally secure as long as the attacker uses less than $\frac{n^2}{25}$ memory bits and the scheme is used at most an exponential number of times. Previous works on bounded-storage cryptography assumed that the memory size used by the attacker is at most linear in the time needed for encryption/decryption.
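The setup in the abstract is easy to make concrete. Below is a minimal Python sketch, with illustrative names not taken from the paper, of the sample stream and of the classical memory-hungry learner: Gaussian elimination over GF(2) recovers $x$ from roughly $n$ samples, but storing the system of equations takes on the order of $n^2$ bits, which is exactly the quadratic-memory regime the lower bound says cannot be avoided without paying exponentially many samples.

```python
import random

n = 16  # length of the hidden string x (the abstract's n)

def inner_product_mod2(a, x):
    """Return <a, x> mod 2 for two bit-vectors."""
    return sum(ai & xi for ai, xi in zip(a, x)) % 2

def sample_stream(x):
    """Yield samples (a_t, b_t): a_t uniform over {0,1}^n, b_t = <a_t, x> mod 2."""
    while True:
        a = [random.randint(0, 1) for _ in range(len(x))]
        yield a, inner_product_mod2(a, x)

def learn_by_elimination(stream, n):
    """Recover x by Gaussian elimination over GF(2).

    The learner stores up to n linearly independent equations of n bits
    each, i.e. Theta(n^2) bits of memory -- the quadratic-size memory
    that, per the theorem, any sample-efficient learner must use."""
    rows = []  # echelon form: (row, rhs) pairs, sorted by pivot position
    for a, b in stream:
        a = list(a)
        for r, rb in rows:           # reduce against existing pivots in order
            p = r.index(1)
            if a[p]:
                a = [ai ^ ri for ai, ri in zip(a, r)]
                b ^= rb
        if any(a):                   # linearly independent: keep the equation
            rows.append((a, b))
            rows.sort(key=lambda row: row[0].index(1))
        if len(rows) == n:           # full rank: x is determined
            break
    x = [0] * n
    for r, b in reversed(rows):      # back-substitute, highest pivot first
        p = r.index(1)
        x[p] = (b ^ sum(r[j] & x[j] for j in range(p + 1, n))) % 2
    return x

secret = [random.randint(0, 1) for _ in range(n)]
assert learn_by_elimination(sample_stream(secret), n) == secret
```

Since fresh uniform $a_t$'s are linearly independent with high probability, the loop collects $n$ independent equations after only $O(n)$ samples; the theorem says that shrinking the memory below $n^2/25$ bits forces the sample count up to exponential.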
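The abstract gives the encryption scheme's parameters (key length $n$, time $n$ per encrypted bit) but not its construction; the sketch below, reusing the helpers above, is one natural instantiation rather than the paper's definition: mask each plaintext bit with a fresh parity of the shared secret key.

```python
def encrypt_bit(key, m):
    """Encrypt bit m: publish (a, <a, key> XOR m) for a fresh uniform a."""
    a = [random.randint(0, 1) for _ in range(len(key))]
    return a, inner_product_mod2(a, key) ^ m

def decrypt_bit(key, ciphertext):
    """Decrypt: recompute the parity <a, key> and strip the mask."""
    a, c = ciphertext
    return inner_product_mod2(a, key) ^ c

key = [random.randint(0, 1) for _ in range(n)]
assert all(decrypt_bit(key, encrypt_bit(key, m)) == m for m in (0, 1))
```

Recovering the key from known-plaintext ciphertexts is precisely parity learning, so the lower bound transfers: an attacker with fewer than $n^2/25$ memory bits needs exponentially many ciphertexts, matching the unconditional security claim.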
