On Probability Estimation by Exponential Smoothing (1501.01202v2)
Abstract: Probability estimation is essential for every statistical data compression algorithm. In practice, probability estimation should be adaptive: recent observations should receive a higher weight than older observations. We present a probability estimation method based on exponential smoothing that satisfies this requirement and runs in constant time per letter. Our main contribution is a theoretical analysis in the case of a binary alphabet for various smoothing rate sequences: we show that the redundancy w.r.t. a piecewise stationary model with $s$ segments is $O\left(s\sqrt n\right)$ for any bit sequence of length $n$, an improvement over the redundancy $O\left(s\sqrt{n\log n}\right)$ of previous approaches with similar time complexity.
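The core mechanism is simple enough to sketch. Below is a minimal illustration of probability estimation by exponential smoothing for a binary alphabet, assuming the standard update $p \leftarrow (1-\alpha_t)\,p + \alpha_t x_t$; the particular smoothing-rate sequence $\alpha_t = 1/\sqrt{t+1}$, the initial estimate $0.5$, and the clamping constant are illustrative assumptions, not the exact choices analyzed in the paper.

```python
# Sketch: adaptive probability estimation for a binary alphabet via
# exponential smoothing, p <- (1 - a_t) * p + a_t * x_t, in O(1) per letter.
# The rate sequence a_t = 1/sqrt(t + 1) and p_init = 0.5 are assumptions
# made for illustration only.

import math

class ExponentialSmoothingEstimator:
    def __init__(self, p_init=0.5):
        self.p = p_init   # current estimate of P(next bit = 1)
        self.t = 0        # number of bits observed so far

    def predict(self):
        # Probability assigned to the next bit being 1.
        return self.p

    def update(self, bit):
        # Blend the new observation into the estimate; recent bits
        # receive a higher weight than older ones.
        rate = 1.0 / math.sqrt(self.t + 1)   # assumed smoothing-rate sequence
        self.p = (1.0 - rate) * self.p + rate * bit
        self.t += 1

if __name__ == "__main__":
    # Accumulate the ideal code length (in bits) of a sample sequence;
    # this is the quantity the redundancy bound in the abstract refers to.
    bits = [1, 1, 0, 1, 1, 1, 0, 0, 0, 0]
    est = ExponentialSmoothingEstimator()
    code_length = 0.0
    for b in bits:
        p1 = min(max(est.predict(), 1e-6), 1 - 1e-6)  # clamp away from 0 and 1
        p = p1 if b == 1 else 1.0 - p1
        code_length += -math.log2(p)
        est.update(b)
    print(f"ideal code length: {code_length:.3f} bits for {len(bits)} input bits")
```

The decaying rate sequence lets the estimate track the active segment of a piecewise stationary source while the $O(1)$ update keeps the overall cost linear in the sequence length.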