
Tight Static Lower Bounds for Non-Adaptive Data Structures

(2001.05053)
Published Jan 14, 2020 in cs.DS

Abstract

In this paper, we study the static cell probe complexity of non-adaptive data structures that maintain a subset of $n$ points from a universe consisting of $m = n^{1+\Omega(1)}$ points. A data structure is non-adaptive when the memory locations accessed during a query depend only on the query inputs and not on the contents of memory. We prove an $\Omega(\log m / \log(sw/(n\log m)))$ static cell probe complexity lower bound for non-adaptive data structures that solve the fundamental dictionary problem, where $s$ denotes the space of the data structure in number of cells and $w$ is the cell size in bits. Our lower bounds hold for all word sizes, including the bit probe model ($w = 1$), and are matched by the upper bounds of Boninger et al. [FSTTCS'17]. Our results imply a sharp dichotomy between dictionary data structures with one round of adaptivity and those with at least two rounds of adaptivity. We show that $O(1)$-overhead, or even $O(\log^{1-\epsilon} m)$-overhead, dictionary constructions are achievable only with at least two rounds of adaptivity. In particular, we show that many $O(1)$-overhead dictionary constructions with two rounds of adaptivity, such as cuckoo hashing, are optimal in terms of adaptivity. On the other hand, non-adaptive dictionaries must use significantly more overhead. Finally, our results also imply static lower bounds for the non-adaptive predecessor problem. Our static lower bounds are higher than the previous best known lower bounds of $\Omega(\log m / \log w)$ for the dynamic predecessor problem by Boninger et al. [FSTTCS'17] and Ramamoorthy and Rao [CCC'18] in the natural setting of linear space $s = \Theta(n)$, where each point fits in a single cell, $w = \Theta(\log m)$. Furthermore, our results are stronger, as they apply in the static setting, unlike the previous lower bounds, which held only in the dynamic setting.
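
For intuition on why these bounds improve on prior work: plugging the linear-space setting $s = \Theta(n)$ and $w = \Theta(\log m)$ into the bound above gives $sw/(n\log m) = \Theta(1)$, so the non-adaptive lower bound becomes $\Omega(\log m)$, whereas the earlier dynamic bound $\Omega(\log m / \log w)$ evaluates to only $\Omega(\log m / \log\log m)$ at this word size.

The abstract's claim that cuckoo hashing uses two rounds of adaptivity can be made concrete. Below is a minimal Python sketch (not from the paper; all identifiers such as CuckooTable and _probe are hypothetical) of a cuckoo hashing lookup: the first round reads the stored hash seeds from memory, and only then can the two table probes of the second round be addressed, so the probe locations depend on memory contents and the scheme is adaptive.

    import random

    class CuckooTable:
        """Toy cuckoo hash table; insertion/eviction logic is omitted."""

        def __init__(self, size: int):
            self.table1 = [None] * size
            self.table2 = [None] * size
            # The hash seeds live in memory. A query must read them
            # (round 1) before it can compute which cells to probe
            # (round 2), which is exactly what makes the lookup adaptive.
            self.seed1 = random.getrandbits(64)
            self.seed2 = random.getrandbits(64)

        def _probe(self, seed: int, key) -> int:
            # Map (seed, key) to a cell index; tables share one size.
            return hash((seed, key)) % len(self.table1)

        def lookup(self, key) -> bool:
            # Round 1: read memory contents (the hash seeds).
            s1, s2 = self.seed1, self.seed2
            # Round 2: probe the two cells whose addresses were
            # computed from the contents read in round 1.
            return (self.table1[self._probe(s1, key)] == key
                    or self.table2[self._probe(s2, key)] == key)

A non-adaptive dictionary, by contrast, must fix its probe locations as a function of the key alone (e.g., with hash functions known to the query algorithm without reading memory), and the paper's lower bound shows that this costs $\Omega(\log m / \log(sw/(n\log m)))$ probes.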
