This paper examines relationships between the conditional Shannon entropy and
the expectation of the ℓα-norm for joint probability distributions.
More precisely, we establish the tight bounds on the expectation of the
ℓα-norm for a fixed conditional Shannon entropy, and vice versa.
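Throughout, these two quantities are assumed to take their standard forms; the following display is a notational sketch (the shorthand N_α is introduced here only for illustration and is not taken from the paper):
\[
H(X \mid Y) = - \sum_{y} P_Y(y) \sum_{x} P_{X|Y}(x \mid y) \log P_{X|Y}(x \mid y),
\]
\[
N_\alpha(X \mid Y) := \mathbb{E}\bigl[ \| P_{X|Y}(\cdot \mid Y) \|_{\alpha} \bigr]
= \sum_{y} P_Y(y) \Bigl( \sum_{x} P_{X|Y}(x \mid y)^{\alpha} \Bigr)^{1/\alpha}, \qquad \alpha > 0 .
\]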
As applications of the results, we derive the tight bounds between the
conditional Shannon entropy and several information measures that are
determined by the expectation of the ℓα-norm, e.g., the conditional
Rényi entropy and the conditional R-norm information.
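For instance, under Arimoto's definition of the conditional Rényi entropy (assumed here because it is the variant expressible directly through the expectation of the ℓα-norm), the connection is a monotone reparametrization:
\[
H_{\alpha}^{\mathrm{A}}(X \mid Y) = \frac{\alpha}{1-\alpha} \log \mathbb{E}\bigl[ \| P_{X|Y}(\cdot \mid Y) \|_{\alpha} \bigr], \qquad \alpha > 0,\ \alpha \neq 1,
\]
so tight bounds on the expectation of the ℓα-norm carry over immediately to tight bounds on the conditional Rényi entropy.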
Moreover, we apply these results to discrete memoryless channels under a
uniform input distribution and show the tight bounds on Gallager's E0
function with a fixed mutual information.