We correct claims about lower bounds on mutual information (MI) between
real-valued random variables made in A. Kraskov {\it et al.}, Phys. Rev. E {\bf
69}, 066138 (2004). We show that non-trivial lower bounds on MI in terms of
linear correlations depend on the marginal (single-variable) distributions.
This is so in spite of the invariance of MI under reparametrizations, because
linear correlations are not invariant under them. The simplest bounds are
obtained for Gaussian marginals, while the most interesting ones for practical
purposes hold for uniform marginal distributions. The latter can be enforced in
general by using the ranks of the individual variables instead of their actual
values, in which case one obtains bounds on MI in terms of Spearman correlation
coefficients. We show with gene expression data that these bounds are in
general non-trivial, and the degree of their (non-)saturation yields valuable
insight.
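For concreteness, here is a minimal sketch of the Gaussian case via the
standard maximum-entropy argument (a reconstruction under that assumption, not
quoted from the paper); $r$ denotes the linear (Pearson) correlation and $h$
differential entropy. At fixed covariance the bivariate Gaussian maximizes
joint entropy, so
\begin{align*}
  h(X,Y) &\le \tfrac{1}{2}\ln\!\left[(2\pi e)^2\,\sigma_X^2\,\sigma_Y^2\,(1-r^2)\right],\\
  I(X,Y) &= h(X) + h(Y) - h(X,Y) \ge -\tfrac{1}{2}\ln(1-r^2),
\end{align*}
where the last step inserts the Gaussian marginal entropies
$h(X)=\tfrac{1}{2}\ln(2\pi e\,\sigma_X^2)$ and
$h(Y)=\tfrac{1}{2}\ln(2\pi e\,\sigma_Y^2)$; it is precisely this step that
ties the bound to the marginal distributions.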
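Likewise, a minimal code sketch of the rank transformation (illustrative only;
the function and variable names are hypothetical, and the reference bound shown
is the Gaussian-marginal one above, not the uniform-marginal bound derived in
the paper):
\begin{verbatim}
import numpy as np
from scipy.stats import rankdata

def spearman_rho(x, y):
    # Ranks have uniform marginals by construction, so the
    # Pearson correlation of the ranks is Spearman's coefficient.
    rx, ry = rankdata(x), rankdata(y)
    return np.corrcoef(rx, ry)[0, 1]

def gaussian_mi_bound(r):
    # -1/2 ln(1 - r^2): the MI lower bound valid for Gaussian
    # marginals, shown here only as a reference value.
    return -0.5 * np.log(1.0 - r**2)

# Hypothetical usage on two expression profiles:
# rho = spearman_rho(expr_a, expr_b)
# print(gaussian_mi_bound(rho))
\end{verbatim}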