    Two Measures of Dependence

    Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.

    Comment: 40 pages; 1 figure; published in Entropy
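
    For orientation, the following LaTeX sketch spells out the quantities named above. The definition of the Rényi divergence is the standard one; the specific minimization form of the dependence measure J_α is an assumption suggested by the abstract, not necessarily the paper's exact definition.

    % Rényi divergence of order \alpha between pmfs P and Q (standard definition);
    % it recovers the Kullback--Leibler divergence D(P \| Q) as \alpha \to 1.
    \[
      D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1}
        \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
      \qquad
      \lim_{\alpha \to 1} D_\alpha(P \,\|\, Q) = D(P \,\|\, Q).
    \]
    % One natural dependence measure built on D_\alpha (assumed form): minimize
    % over product distributions on the pair alphabet.
    \[
      J_\alpha(X;Y) = \min_{Q_X,\, Q_Y}
        D_\alpha\bigl(P_{XY} \,\big\|\, Q_X \times Q_Y\bigr).
    \]
    % At \alpha = 1 the Kullback--Leibler decomposition
    %   D(P_{XY} \| Q_X \times Q_Y) = I(X;Y) + D(P_X \| Q_X) + D(P_Y \| Q_Y)
    % shows the minimum is attained at the marginals, so J_1(X;Y) = I(X;Y),
    % Shannon's mutual information.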