
Entropy of Highly Correlated Quantized Data

Abstract

This paper considers the entropy of highly correlated quantized samples. Two results are shown. The first concerns sampling and identically scalar quantizing a stationary continuous-time random process over a finite interval. It is shown that if the process crosses a quantization threshold with positive probability, then the joint entropy of the quantized samples tends to infinity as the sampling rate goes to infinity. The second result provides an upper bound on the rate at which the joint entropy tends to infinity, in the case of an infinite-level uniform threshold scalar quantizer and a stationary Gaussian random process. Specifically, an asymptotic formula for the conditional entropy of one quantized sample conditioned on the previous quantized sample is derived. At high sampling rates, these results indicate a sharp contrast between the large encoding rate (in bits/sec) required by a lossy source code consisting of a fixed scalar quantizer and an ideal, sampling-rate-adapted lossless code, and the bounded encoding rate required by an ideal lossy source code operating at the same distortion.
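The contrast described in the abstract can be illustrated numerically. The sketch below (an assumption of this note, not code from the paper) samples a stationary Ornstein-Uhlenbeck Gaussian process at interval `tau` via its exact AR(1) discretization, applies a uniform threshold quantizer with an arbitrarily chosen step `delta = 0.5`, and estimates the conditional entropy H(Q_{k+1} | Q_k) by the plug-in method. As `tau` shrinks, the per-sample conditional entropy decreases (successive quantized samples become highly correlated), yet the entropy rate in bits/sec grows, consistent with the joint entropy tending to infinity as the sampling rate increases.

```python
import numpy as np
from collections import Counter

def cond_entropy_bits(tau, delta=0.5, n=200_000, seed=0):
    """Estimate H(Q_{k+1} | Q_k) in bits for uniformly quantized samples
    of a stationary Ornstein-Uhlenbeck (Gaussian) process sampled every
    tau seconds. Illustrative sketch only; delta, n, seed are arbitrary."""
    rng = np.random.default_rng(seed)
    rho = np.exp(-tau)                     # correlation of samples tau apart
    x = np.empty(n)
    x[0] = rng.standard_normal()           # start in the stationary law
    noise = rng.standard_normal(n - 1) * np.sqrt(1.0 - rho**2)
    for k in range(n - 1):                 # exact AR(1) discretization of OU
        x[k + 1] = rho * x[k] + noise[k]
    q = np.floor(x / delta).astype(int)    # infinite-level uniform quantizer

    def entropy(counts):
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return float(-(p * np.log2(p)).sum())

    h_joint = entropy(Counter(zip(q[:-1].tolist(), q[1:].tolist())))
    h_single = entropy(Counter(q[:-1].tolist()))
    return h_joint - h_single              # H(Q2 | Q1) = H(Q1, Q2) - H(Q1)

h_slow = cond_entropy_bits(tau=0.5)        # coarse sampling
h_fast = cond_entropy_bits(tau=0.05)       # 10x faster sampling
# Per-sample conditional entropy shrinks with tau, but bits/sec grows.
print(h_slow, h_fast, h_slow / 0.5, h_fast / 0.05)
```

A fixed scalar quantizer followed by an ideal lossless code must pay roughly H(Q_{k+1} | Q_k)/tau bits/sec, which the sketch shows increasing as tau decreases, whereas an ideal lossy code at the same distortion needs only a bounded rate.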
