
Mutual information and conditional mean prediction error

Abstract

This version: arXiv:1407.7165v1.

Mutual information is fundamentally important for measuring statistical dependence between variables and for quantifying information transfer by signaling and communication mechanisms. It can, however, be challenging to evaluate for physical models of such mechanisms and to estimate reliably from data. Furthermore, its relationship to better known statistical procedures is still poorly understood. Here we explore new connections between mutual information and regression-based dependence measures, $\nu^{-1}$, that utilise the determinant of the second-moment matrix of the conditional mean prediction error. We examine convergence properties as $\nu \rightarrow 0$ and establish sharp lower bounds on mutual information and capacity of the form $\log(\nu^{-1/2})$. The bounds are tighter than lower bounds based on the Pearson correlation and ones derived using average mean square-error rate-distortion arguments. Furthermore, their estimation is feasible using techniques from nonparametric regression. As an illustration we provide bootstrap confidence intervals for the lower bounds which, through use of a composite estimator, substantially improve upon inference about mutual information based on $k$-nearest neighbour estimators alone.
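The abstract does not spell out the estimator, but a minimal plug-in sketch of a bound of this form is given below. It assumes that $\nu$ is the ratio of the determinant of the second-moment matrix of the conditional-mean prediction error to the determinant of the covariance of $Y$, and that the conditional mean is fit by $k$-nearest-neighbour regression; the function name `mi_lower_bound`, the normalisation, and all parameter choices are illustrative assumptions rather than the paper's own construction.

```python
# Illustrative sketch (not the paper's estimator): a plug-in estimate of a
# lower bound of the form log(nu^{-1/2}) on I(X; Y), where nu is assumed to be
# det(second moment of Y - E[Y|X]) / det(Cov(Y)).
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def mi_lower_bound(X, Y, n_neighbors=10):
    """Return a plug-in estimate of log(nu^{-1/2}), in nats."""
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    if X.ndim == 1:
        X = X[:, None]
    if Y.ndim == 1:
        Y = Y[:, None]
    # Nonparametric regression estimate of the conditional mean m(x) = E[Y | X = x].
    # (In practice a leave-one-out or held-out fit would reduce in-sample bias.)
    m_hat = KNeighborsRegressor(n_neighbors=n_neighbors).fit(X, Y).predict(X)
    resid = Y - m_hat
    # Second-moment matrix of the prediction error and covariance of Y.
    S_err = resid.T @ resid / resid.shape[0]
    S_y = np.cov(Y, rowvar=False).reshape(Y.shape[1], Y.shape[1])
    nu = np.linalg.det(S_err) / np.linalg.det(S_y)
    return -0.5 * np.log(nu)

# Example: correlated Gaussian pair, for which I(X; Y) = -0.5 * log(1 - rho^2) ~= 0.51 nats.
rng = np.random.default_rng(0)
rho = 0.8
x = rng.normal(size=(5000, 1))
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=(5000, 1))
print(mi_lower_bound(x, y))
```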
