On empirical cumulant generating functions of code lengths for individual sequences
We consider the problem of lossless compression of individual sequences using
finite-state (FS) machines, from the perspective of the best achievable
empirical cumulant generating function (CGF) of the code length, i.e., the
normalized logarithm of the empirical average of the exponentiated code length.
Since the probabilistic CGF is minimized in terms of the Rényi entropy of the
source, one of the motivations of this study is to derive an
individual-sequence analogue of the Rényi entropy, in the same way that the
FS compressibility is the individual-sequence counterpart of the Shannon
entropy. We consider the CGF of the code-length both from the perspective of
fixed-to-variable (F-V) length coding and the perspective of
variable-to-variable (V-V) length coding, where the latter turns out to yield a
better result, which coincides with the FS compressibility. We also extend our
results to compression with side information, available at both the encoder and
decoder. In this case, the V-V version no longer coincides with the FS
compressibility, but results in a different complexity measure.Comment: 15 pages; submitted for publicatio
Optimum estimation via gradients of partition functions and information measures: a statistical-mechanical perspective
In continuation of a recent work on the statistical-mechanical analysis of
minimum mean square error (MMSE) estimation in Gaussian noise via its relation
to the mutual information (the I-MMSE relation), here we propose a simple and
more direct relationship between optimum estimation and certain information
measures (e.g., the information density and the Fisher information), which can
be viewed as partition functions and hence are amenable to analysis using
statistical-mechanical techniques. The proposed approach has several
advantages, most notably, its applicability to general sources and channels, as
opposed to the I-MMSE relation and its variants which hold only for certain
classes of channels (e.g., additive white Gaussian noise channels). We then
demonstrate the derivation of the conditional mean estimator and the MMSE in a
few examples. Two of these examples turn out to be generalizable to a fairly
wide class of sources and channels. For this class, the proposed approach is
shown to yield an approximate conditional mean estimator and an MMSE formula
that has the flavor of a single-letter expression. We also show how our
approach can easily be generalized to situations of mismatched estimation.
Comment: 21 pages; submitted to the IEEE Transactions on Information Theory
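The relation between optimum estimation and gradients of partition functions can be illustrated numerically in the simplest setting the abstract mentions: a Gaussian channel, where the conditional mean follows from the gradient of the log-partition function (Tweedie's formula, a textbook special case). The discrete prior, the function names, and the finite-difference gradient are assumptions made for this sketch, not the paper's construction.

```python
import math

def conditional_mean_via_partition(prior, sigma, y, h=1e-5):
    """For y = x + n with n ~ N(0, sigma^2) and a discrete prior {x: p(x)},
    compare two routes to the conditional mean E[X | y]:
    (1) y + sigma^2 * d/dy log Z(y), where Z(y) is the partition function
        (the unnormalized output density), via a central finite difference;
    (2) the direct posterior average. Illustrative sketch only."""
    def Z(t):
        return sum(p * math.exp(-(t - x) ** 2 / (2 * sigma ** 2))
                   for x, p in prior.items())
    grad_log_Z = (math.log(Z(y + h)) - math.log(Z(y - h))) / (2 * h)
    est_gradient = y + sigma ** 2 * grad_log_Z
    est_direct = sum(x * p * math.exp(-(y - x) ** 2 / (2 * sigma ** 2))
                     for x, p in prior.items()) / Z(y)
    return est_gradient, est_direct
```

For a symmetric binary prior on {-1, +1} with sigma = 1, both routes recover the familiar tanh(y) estimator, which is the kind of closed-form conditional mean the abstract's examples produce.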