Abstract — New upper and lower bounds are given for the joint entropy of a collection of random variables, in both the discrete and continuous settings. These bounds generalize well-known information-theoretic inequalities due to Han. A number of applications are suggested, including a new bound on the number of independent sets of a graph, which is of interest in discrete mathematics, and a bound on the number of zero-error codes.
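For context, the classical inequality of Han that these bounds generalize can be stated as follows; this is standard background and not part of the original abstract. For discrete random variables $X_1, \dots, X_n$ with joint entropy $H(X_1, \dots, X_n)$,
\[
H(X_1, \dots, X_n) \;\le\; \frac{1}{n-1} \sum_{i=1}^{n} H(X_1, \dots, X_{i-1}, X_{i+1}, \dots, X_n),
\]
i.e., the joint entropy is bounded above by the normalized sum of the entropies of the $n$ subsets obtained by leaving out one variable at a time.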