
Universal compression of Gaussian sources with unknown parameters

Abstract

For a collection of distributions over a countable support set, the worst case universal compression formulation by Shtarkov attempts to assign a universal distribution over the support set. The formulation aims to ensure that the universal distribution does not underestimate the probability of any element in the support set relative to distributions in the collection. When the alphabet is uncountable and we have a collection ${\cal P}$ of Lebesgue continuous measures instead, we ask if there is a corresponding universal probability density function (pdf) that does not underestimate the value of the density function at any point in the support relative to pdfs in ${\cal P}$. Analogous to the worst case redundancy of a collection of distributions over a countable alphabet, we define the \textit{attenuation} of a class to be $A$ when the worst case optimal universal pdf at any point $x$ in the support is always at least the value any pdf in the collection ${\cal P}$ assigns to $x$ divided by $A$. We analyze the attenuation of the worst case optimal universal pdf over length-$n$ samples generated \textit{i.i.d.} from a Gaussian distribution whose mean can be anywhere between $-\alpha/2$ and $\alpha/2$ and whose variance lies between $\sigma_m^2$ and $\sigma_M^2$. We show that this attenuation is finite, grows with the number of samples as ${\cal O}(n)$, and we specify the attenuation exactly without approximation. When only one parameter is allowed to vary, we show that the attenuation grows as ${\cal O}(\sqrt{n})$, in line with prior results that fix the order of magnitude at a factor of $\sqrt{n}$ per parameter. In addition, we specify the attenuation exactly without approximation when only the mean or only the variance is allowed to vary.
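
As a reading aid, the display below is a minimal sketch of the attenuation definition above in symbols, assuming the countable-alphabet Shtarkov formulation carries over to densities as described; the labels $q$, $p$, $x^n$, and $A^*$ are ours and not necessarily the paper's notation.

% q is a candidate universal pdf over length-n samples; p ranges over the
% i.i.d. Gaussian product densities in the class {\cal P}.
\[
  q \text{ achieves attenuation } A
  \quad\Longleftrightarrow\quad
  q(x^n) \,\ge\, \frac{p(x^n)}{A}
  \quad \text{for all sequences } x^n \text{ and all } p \in {\cal P},
\]
\[
  A^*({\cal P}) \;=\; \inf_{q}\, \sup_{p \in {\cal P}}\, \sup_{x^n}\, \frac{p(x^n)}{q(x^n)} .
\]

Under this reading, the results stated above say that $A^*$ is finite and grows as ${\cal O}(n)$ when both the mean and the variance may vary, and as ${\cal O}(\sqrt{n})$ when only one of the two parameters may vary.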
