
Abstract

Motivated by earlier results on universal randomized guessing, we consider an individual-sequence approach to the guessing problem: in this setting, the goal is to guess a secret, individual (deterministic) vector $x^n=(x_1,\ldots,x_n)$, by using a finite-state machine that sequentially generates randomized guesses from a stream of purely random bits. We define the finite-state guessing exponent as the asymptotic normalized logarithm of the minimum achievable moment of the number of randomized guesses, generated by any finite-state machine, until $x^n$ is guessed successfully. We show that the finite-state guessing exponent of any sequence is intimately related to its finite-state compressibility (due to Lempel and Ziv), and it is asymptotically achieved by the decoder of (a certain modified version of) the 1978 Lempel-Ziv data compression algorithm (a.k.a. the LZ78 algorithm), fed by purely random bits. The results are also extended to the case where the guessing machine has access to a side information sequence, $y^n=(y_1,\ldots,y_n)$, which is also an individual sequence.
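As a rough formalization of the quantity described above (the symbols $E_\rho$, $G_M$, and the state budget $s$ are assumed notation for illustration and are not quoted from the paper), the finite-state guessing exponent of an individual sequence $\boldsymbol{x}$, for the $\rho$-th moment, might be written as

$$ E_\rho(\boldsymbol{x}) \;=\; \lim_{s\to\infty}\,\limsup_{n\to\infty}\,\frac{1}{n}\,\log\,\min_{M\in\mathcal{M}_s}\,\boldsymbol{E}\!\left[G_M(x^n)^{\rho}\right], $$

where $\mathcal{M}_s$ denotes the class of guessing machines with at most $s$ states, $G_M(x^n)$ is the (random) number of guesses that machine $M$ emits until it produces $x^n$, and the expectation is taken over the stream of purely random bits driving $M$. Under this reading, the abstract's main claim is that this exponent is characterized by the Lempel-Ziv finite-state compressibility of $\boldsymbol{x}$ and is attained asymptotically by a modified LZ78 decoder fed with random bits.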
