
Relations Between Conditional Shannon Entropy and Expectation of $\ell_{\alpha}$-Norm

Abstract

The paper examines relationships between the conditional Shannon entropy and the expectation of the $\ell_{\alpha}$-norm for joint probability distributions. More precisely, we investigate the tight bounds of the expectation of the $\ell_{\alpha}$-norm with a fixed conditional Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the conditional Shannon entropy and several information measures which are determined by the expectation of the $\ell_{\alpha}$-norm, e.g., the conditional R\'{e}nyi entropy and the conditional $R$-norm information. Moreover, we apply these results to discrete memoryless channels under a uniform input distribution. Then, we show the tight bounds of Gallager's $E_{0}$ functions with a fixed mutual information under a uniform input distribution.

Comment: a short version was submitted to ISIT'201
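The two central quantities of the abstract can be sketched numerically. The snippet below, a minimal illustration (function names and the example distribution are my own, not from the paper), computes the conditional Shannon entropy $H(X \mid Y) = -\sum_{y} P(y) \sum_{x} P(x \mid y) \ln P(x \mid y)$ and the expectation of the $\ell_{\alpha}$-norm, $\mathbb{E}\big[\lVert P_{X \mid Y}(\cdot \mid Y)\rVert_{\alpha}\big] = \sum_{y} P(y) \big(\sum_{x} P(x \mid y)^{\alpha}\big)^{1/\alpha}$, for a joint distribution given as a matrix `joint[y][x] = P(X=x, Y=y)`:

```python
import math

def conditional_shannon_entropy(joint):
    """H(X|Y) in nats, for joint[y][x] = P(X=x, Y=y)."""
    h = 0.0
    for row in joint:
        p_y = sum(row)                      # marginal P(Y=y)
        if p_y == 0.0:
            continue
        for p_xy in row:
            if p_xy > 0.0:
                h -= p_xy * math.log(p_xy / p_y)  # -P(x,y) log P(x|y)
    return h

def expected_l_alpha_norm(joint, alpha):
    """E[ ||P_{X|Y}(.|Y)||_alpha ] = sum_y P(y) (sum_x P(x|y)^alpha)^(1/alpha)."""
    total = 0.0
    for row in joint:
        p_y = sum(row)
        if p_y == 0.0:
            continue
        norm = sum((p_xy / p_y) ** alpha for p_xy in row) ** (1.0 / alpha)
        total += p_y * norm
    return total

# Example: uniform joint distribution on a 2x2 alphabet.
joint = [[0.25, 0.25], [0.25, 0.25]]
print(conditional_shannon_entropy(joint))   # ln 2, since P(x|y) is uniform on 2 symbols
print(expected_l_alpha_norm(joint, 2.0))    # sqrt(1/2) for the uniform conditional
```

Measures such as the conditional R\'{e}nyi entropy are monotone functions of this expectation, which is why bounds on one quantity translate into bounds on the other.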
