Let $\mathcal{N}=\{1,\cdots,n\}$. The entropy function $h$ of a set of $n$
discrete random variables $\{X_i : i\in\mathcal{N}\}$ is a $2^n$-dimensional vector
whose entries are $h(A)\triangleq H(X_A)$, $A\subset\mathcal{N}$, the (joint) entropies of the subsets
of the set of $n$ random variables, with $H(X_\emptyset)=0$ by convention. The
set of all entropy functions for $n$ discrete random variables, denoted by
$\Gamma^*_n$, is called the entropy function region for $n$. The characterization
of $\Gamma^*_n$ and its closure $\overline{\Gamma^*_n}$ are well-known open
problems in information theory. They are important not only because they play
key roles in information theory problems, but also because they are related to other
subjects in mathematics and physics.
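As a concrete illustration of the definition above, the following sketch builds the entropy function $h$ of a small joint distribution by computing $H(X_A)$ for every subset $A$ of the index set. The example pmf is hypothetical, chosen only to make the arithmetic transparent.

```python
from itertools import combinations
from math import log2

# Hypothetical joint pmf of (X1, X2): two independent fair bits.
# Keys are outcome tuples (x1, x2); values are probabilities.
pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
n = 2

def marginal(pmf, A):
    """Marginal pmf of the sub-tuple X_A (A is a tuple of 0-based indices)."""
    m = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in A)
        m[key] = m.get(key, 0.0) + p
    return m

def H(pmf, A):
    """Joint entropy H(X_A) in bits, with H(X_emptyset) = 0 by convention."""
    if not A:
        return 0.0
    return -sum(p * log2(p) for p in marginal(pmf, A).values() if p > 0)

# The entropy function h: one entry per subset A of the n indices,
# giving a 2^n-dimensional vector indexed by subsets.
h = {A: H(pmf, A) for k in range(n + 1) for A in combinations(range(n), k)}
```

For this distribution $h(\{1\}) = h(\{2\}) = 1$ bit and $h(\{1,2\}) = 2$ bits; the vector $h$ is a single point of $\Gamma^*_2$.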
In this paper, we consider \emph{partition-symmetrical entropy functions}.
Let $p=\{\mathcal{N}_1,\cdots,\mathcal{N}_t\}$ be a $t$-partition of $\mathcal{N}$. An
entropy function $h$ is called $p$-symmetrical if for all $A, B\subset\mathcal{N}$, $h(A)=h(B)$ whenever $|A\cap\mathcal{N}_i|=|B\cap\mathcal{N}_i|$, $i=1,\cdots,t$. The set of
all the $p$-symmetrical entropy functions, denoted by $\Psi^*_p$, is called the
$p$-symmetrical entropy function region. We prove that $\overline{\Psi^*_p}$,
the closure of $\Psi^*_p$, is completely characterized by Shannon-type
information inequalities if and only if $p$ is the $1$-partition or a
$2$-partition with one of its blocks being a singleton.
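The $p$-symmetry condition can be checked mechanically: group the subsets of $\mathcal{N}$ by their intersection-size profile $(|A\cap\mathcal{N}_1|,\ldots,|A\cap\mathcal{N}_t|)$ and verify that $h$ is constant on each group. A minimal sketch, assuming $h$ is stored as a dict keyed by frozensets (a representation chosen here for illustration, not taken from the paper):

```python
from itertools import combinations

def powerset(s):
    """All subsets of s, as frozensets."""
    s = list(s)
    return [frozenset(c) for k in range(len(s) + 1) for c in combinations(s, k)]

def is_p_symmetrical(h, partition, tol=1e-9):
    """True iff h(A) == h(B) whenever |A ∩ N_i| == |B ∩ N_i| for every block N_i."""
    ground = frozenset().union(*partition)
    profile = lambda A: tuple(len(A & frozenset(Ni)) for Ni in partition)
    groups = {}
    for A in powerset(ground):
        groups.setdefault(profile(A), []).append(h[A])
    return all(max(vals) - min(vals) <= tol for vals in groups.values())

# Example: if h(A) depends only on |A| (e.g. i.i.d. fair bits, where
# h(A) = |A|), then h is symmetrical under the 1-partition {N}.
h_iid = {A: float(len(A)) for A in powerset({1, 2, 3})}
print(is_p_symmetrical(h_iid, [{1, 2, 3}]))  # True
```

Under the $1$-partition the profile is just $|A|$, so the check reduces to the familiar notion of a (fully) symmetrical entropy function; finer partitions impose correspondingly weaker symmetry constraints.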
The characterization of the partition-symmetrical entropy functions can be
useful for solving information-theoretic and related problems where symmetry
exists in the structure of the problem.
Keywords: entropy, entropy function, information inequality, polymatroid.

Comment: This paper is published in IEEE Transactions on Information Theory.