
An Exploration of the Role of Principal Inertia Components in Information Theory

Abstract

The principal inertia components of the joint distribution of two random variables X and Y are inherently connected to how an observation of Y is statistically related to a hidden variable X. In this paper, we explore this connection within an information theoretic framework. We show that, under certain symmetry conditions, the principal inertia components play an important role in estimating one-bit functions of X, namely f(X), given an observation of Y. In particular, the principal inertia components bear an interpretation as filter coefficients in the linear transformation of p_{f(X)|X} into p_{f(X)|Y}. This interpretation naturally leads to the conjecture that the mutual information between f(X) and Y is maximized when all the principal inertia components have equal value. We also study the role of the principal inertia components in the Markov chain B → X → Y → B̂, where B and B̂ are binary random variables. We illustrate our results for the setting where X and Y are binary strings and Y is the result of sending X through an additive noise binary channel.

Comment: Submitted to the 2014 IEEE Information Theory Workshop (ITW)
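As a concrete illustration of the central object, the principal inertia components of a joint distribution can be computed from the singular values of the normalized joint distribution matrix. The sketch below follows the standard definition (the squared singular values of D_X^{-1/2} P D_Y^{-1/2}, excluding the trivial singular value 1); the function name and the BSC example are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def principal_inertia_components(P):
    """Principal inertia components of a joint pmf matrix P (|X| x |Y|).

    Computed as the squared singular values of D_X^{-1/2} P D_Y^{-1/2},
    dropping the trivial singular value sigma_0 = 1.
    """
    P = np.asarray(P, dtype=float)
    px = P.sum(axis=1)                   # marginal distribution of X
    py = P.sum(axis=0)                   # marginal distribution of Y
    Q = P / np.sqrt(np.outer(px, py))    # D_X^{-1/2} P D_Y^{-1/2}
    s = np.linalg.svd(Q, compute_uv=False)
    return s[1:] ** 2                    # drop the trivial component

# Example: X uniform on {0,1}, Y the output of a BSC with crossover 0.1.
P = np.array([[0.45, 0.05],
              [0.05, 0.45]])
pics = principal_inertia_components(P)
# The single nontrivial PIC equals (1 - 2*0.1)^2 = 0.64, the squared
# maximal correlation between X and Y for this channel.
```

For the binary symmetric channel the largest principal inertia component coincides with the squared maximal correlation of (X, Y), which is consistent with the additive-noise binary channel setting discussed in the abstract.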
