Two Measures of Dependence
Two families of dependence measures between random variables are introduced.
They are based on the R\'enyi divergence of order $\alpha$ and the relative
$\alpha$-entropy, respectively, and both dependence measures reduce to
Shannon's mutual information when their order $\alpha$ is one. The first
measure shares many properties with the mutual information, including the
data-processing inequality, and can be related to the optimal error exponents
in composite hypothesis testing. The second measure does not satisfy the
data-processing inequality, but appears naturally in the context of distributed
task encoding.
Comment: 40 pages; 1 figure; published in Entropy
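
For background, the textbook definition of the R\'enyi divergence of order $\alpha$ (a standard quantity, not quoted from the paper itself) is
\[
  D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,
  \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
  \qquad \alpha \in (0,1) \cup (1,\infty),
\]
which tends to the Kullback--Leibler divergence $D(P\|Q) = \sum_x P(x)\log\frac{P(x)}{Q(x)}$ as $\alpha \to 1$. Since Shannon's mutual information can be written as
\[
  I(X;Y) \;=\; D\!\left(P_{XY} \,\middle\|\, P_X P_Y\right),
\]
a dependence measure built by applying $D_\alpha$ to the joint distribution and (a product of) marginals reduces to $I(X;Y)$ at order one, which is consistent with the abstract's claim; the paper's exact constructions may differ in detail.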