The interactions between three or more random variables are often nontrivial, poorly understood, and yet paramount for future advances in fields such as network information theory, neuroscience, genetics, and many others. In this
work, we propose to analyze these interactions as different modes of
information sharing. Towards this end, we introduce a novel axiomatic framework
for decomposing the joint entropy, which characterizes the various ways in
which random variables can share information. The key contribution of our
framework is to distinguish between interdependencies in which information is shared redundantly and synergistic interdependencies in which the sharing structure exists in the whole but not between the parts. We show that our axioms determine unique formulas for all terms of the proposed decomposition in a number of cases of interest. Moreover, we show how these
results can be applied to several network information theory problems,
providing a more intuitive understanding of their fundamental limits.
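
For concreteness, consider what such a decomposition refines (the notation below is illustrative and not taken from the paper). For two variables, the joint entropy already splits into exclusive and shared parts via the standard identity

    H(X_1, X_2) = H(X_1 \mid X_2) + H(X_2 \mid X_1) + I(X_1 ; X_2),

where the mutual information accounts for all of the shared information. With three or more variables this is no longer the case: if X_1 and X_2 are independent fair bits and X_3 = X_1 \oplus X_2, every pairwise mutual information vanishes even though the three variables are jointly constrained. This is the kind of synergistic sharing, present in the whole but absent between the parts, that the proposed decomposition is designed to separate from redundant sharing.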