The ``Gibbs Paradox'' refers to several related questions concerning entropy
in thermodynamics and statistical mechanics: whether it is an extensive
quantity or not, how it changes when identical particles are mixed, and the
proper way to count states in systems of identical particles. Several authors
have recognized that the paradox is resolved once it is realized that there is
no such thing as the entropy of a system, that there are many entropies, and
that whether particles should be treated as distinguishable or not
depends on the resolution of the experiment. The purpose of this note is
essentially pedagogical; we add to their analysis by examining the paradox from
the point of view of information theory. Our argument is based on the
`grouping' property of entropy that Shannon recognized, by including it among
his axioms, as an essential requirement on any measure of information. Not only
does it provide the right connection between different entropies but, in
addition, it draws our attention to the obvious fact that addressing issues of
distinguishability and of counting states requires a clear idea about what
precisely we mean by a state.
Comment: Presented at MaxEnt 2001, the 21st International Workshop on Bayesian Inference and Maximum Entropy Methods (August 4-9, 2001, Baltimore, MD, USA)