On Probability Estimation by Exponential Smoothing

Abstract

Probability estimation is essential for every statistical data compression algorithm. In practice, probability estimation should be adaptive: recent observations should receive a higher weight than older observations. We present a probability estimation method based on exponential smoothing that satisfies this requirement and runs in constant time per letter. Our main contribution is a theoretical analysis, in the case of a binary alphabet, for various smoothing rate sequences: we show that the redundancy w.r.t. a piecewise stationary model with $s$ segments is $O\left(s\sqrt{n}\right)$ for any bit sequence of length $n$, an improvement over the redundancy $O\left(s\sqrt{n\log n}\right)$ of previous approaches with similar time complexity.
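
To make the setting concrete, the sketch below shows the generic exponential-smoothing update for a binary alphabet: the estimate of P(next bit = 1) is shifted toward each observed bit by a smoothing rate, so recent observations carry more weight and each update takes constant time. The constant rate, the initial estimate of 0.5, and the clamping threshold are illustrative assumptions for this sketch; the paper analyzes various smoothing rate sequences rather than this particular choice.

    # Minimal sketch of binary probability estimation via exponential smoothing.
    # The update p <- (1 - alpha) * p + alpha * bit is the generic form; the
    # constant rate and the probability clamp are illustrative assumptions.
    class ExponentialSmoothingEstimator:
        def __init__(self, alpha: float = 0.02, p_min: float = 0.001):
            self.p = 0.5          # current estimate of P(next bit = 1)
            self.alpha = alpha    # smoothing rate (assumed constant here)
            self.p_min = p_min    # keep the estimate away from 0 and 1

        def predict(self) -> float:
            """Return the current probability that the next bit is 1."""
            return self.p

        def update(self, bit: int) -> None:
            """Shift the estimate toward the observed bit; O(1) per letter."""
            self.p = (1 - self.alpha) * self.p + self.alpha * bit
            self.p = min(max(self.p, self.p_min), 1 - self.p_min)

    if __name__ == "__main__":
        est = ExponentialSmoothingEstimator()
        for b in [0, 0, 1, 1, 1, 1, 0, 1]:
            print(f"P(next bit = 1) = {est.predict():.3f}, observed {b}")
            est.update(b)

In a statistical compressor, the predicted probability would typically be fed to an arithmetic coder before the observed bit is used to update the estimate.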
