Lower Bounds on Mutual Information
Scientific paper
Physics – Data Analysis, Statistics and Probability
2010-08-04
Phys. Rev. E 83, 010101(R) (2011)
4 pages
We correct claims about lower bounds on mutual information (MI) between real-valued random variables made in A. Kraskov et al., Phys. Rev. E 69, 066138 (2004). We show that non-trivial lower bounds on MI in terms of linear correlations depend on the marginal (single variable) distributions. This is so in spite of the invariance of MI under reparametrizations, because linear correlations are not invariant under them. The simplest bounds are obtained for Gaussians, but the most interesting ones for practical purposes are obtained for uniform marginal distributions. The latter can be enforced in general by using the ranks of the individual variables instead of their actual values, in which case one obtains bounds on MI in terms of Spearman correlation coefficients. We show with gene expression data that these bounds are in general non-trivial, and the degree of their (non-)saturation yields valuable insight.
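As a rough illustration of the quantities involved (a sketch, not code from the paper): for variables whose marginals are Gaussian, a bound of the familiar form I(X;Y) >= -1/2 ln(1 - rho^2) can be evaluated from the Pearson correlation rho, and ranking the data yields the Spearman coefficient in which the paper's uniform-marginal bounds are expressed; the exact form of those rank-based bounds is given in the paper and not reproduced here. All function names below are illustrative.

```python
import numpy as np
from scipy import stats

def mi_lower_bound_gaussian_marginals(x, y):
    """Lower bound on I(X;Y) in nats from the Pearson correlation,
    assuming both marginals are Gaussian: -0.5 * ln(1 - rho^2)."""
    rho, _ = stats.pearsonr(x, y)
    return -0.5 * np.log1p(-rho ** 2)

def spearman_correlation(x, y):
    """Spearman coefficient = Pearson correlation of the ranks,
    i.e. of the variables forced to (approximately) uniform marginals."""
    rho_s, _ = stats.spearmanr(x, y)
    return rho_s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Bivariate Gaussian test case: here the bound is saturated,
    # since the true MI equals -0.5 * ln(1 - rho^2).
    rho_true = 0.8
    cov = [[1.0, rho_true], [rho_true, 1.0]]
    x, y = rng.multivariate_normal([0.0, 0.0], cov, size=20000).T
    print("Gaussian-marginal bound:", mi_lower_bound_gaussian_marginals(x, y))
    print("Exact Gaussian MI:      ", -0.5 * np.log1p(-rho_true ** 2))
    print("Spearman rho:           ", spearman_correlation(x, y))
```

For non-Gaussian data the interesting comparison is between such a bound and an MI estimate: a large gap (non-saturation) signals dependence that the correlation coefficient alone does not capture.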
David V. Foster
Peter Grassberger