Towards a Definitive Compressibility Measure for Repetitive Sequences (1910.02151v3)

Published 4 Oct 2019 in cs.DS

Abstract: Unlike in statistical compression, where Shannon's entropy is a definitive lower bound, no such clear measure exists for the compressibility of repetitive sequences. Since statistical entropy does not capture repetitiveness, ad-hoc measures like the size $z$ of the Lempel--Ziv parse are frequently used to estimate it. The size $b \le z$ of the smallest bidirectional macro scheme captures better what can be achieved via copy-paste processes, though it is NP-complete to compute and it is not monotonic upon symbol appends. Recently, a more principled measure, the size $\gamma$ of the smallest string \emph{attractor}, was introduced. The measure $\gamma \le b$ lower bounds all the previous relevant ones, yet length-$n$ strings can be represented and efficiently indexed within space $O(\gamma\log\frac{n}{\gamma})$, which also upper bounds most measures. While $\gamma$ is certainly a better measure of repetitiveness than $b$, it is also NP-complete to compute and not monotonic, and it is unknown if one can always represent a string in $o(\gamma\log n)$ space. In this paper, we study an even smaller measure, $\delta \le \gamma$, which can be computed in linear time, is monotonic, and allows encoding every string in $O(\delta\log\frac{n}{\delta})$ space because $z = O(\delta\log\frac{n}{\delta})$. We show that $\delta$ better captures the compressibility of repetitive strings. Concretely, we show that (1) $\delta$ can be strictly smaller than $\gamma$, by up to a logarithmic factor; (2) there are string families needing $\Omega(\delta\log\frac{n}{\delta})$ space to be encoded, so this space is optimal for every $n$ and $\delta$; (3) one can build run-length context-free grammars of size $O(\delta\log\frac{n}{\delta})$, whereas the smallest (non-run-length) grammar can be up to $\Theta(\log n/\log\log n)$ times larger; and (4) within $O(\delta\log\frac{n}{\delta})$ space we can not only...

Citations (52)

