Many problems in network information theory share a common difficulty: single-letterization. We argue that this is due to the lack of a geometric
structure on the space of probability distributions. In this paper, we develop
such a structure by assuming that the distributions of interest are close to
each other. Under this assumption, the K-L divergence reduces to the squared
Euclidean metric in a Euclidean space; a brief sketch of this local expansion follows the abstract. Moreover, we construct notions of
coordinates and inner products, which facilitate solving communication
problems. We also present applications of this approach to the
point-to-point channel and the general broadcast channel, demonstrating
how our technique simplifies information theory problems.
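As a minimal sketch of the local expansion referred to above (the perturbation notation $\epsilon$, $J$, and $L$ is ours, for illustration only): for a perturbed distribution $Q_\epsilon(x) = P(x) + \epsilon J(x)$ with $\sum_x J(x) = 0$, a second-order Taylor expansion of the K-L divergence gives
\[
  D(Q_\epsilon \,\|\, P)
  = \sum_x Q_\epsilon(x) \log \frac{Q_\epsilon(x)}{P(x)}
  = \frac{\epsilon^2}{2} \sum_x \frac{J(x)^2}{P(x)} + o(\epsilon^2)
  = \frac{\epsilon^2}{2} \, \|L\|^2 + o(\epsilon^2),
  \qquad L(x) := \frac{J(x)}{\sqrt{P(x)}},
\]
so that, up to the factor $\epsilon^2/2$ and higher-order terms, divergence computations become squared Euclidean norms in the coordinates $L$.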