The Boltzmann/Shannon entropy as a measure of correlation

Physics – Mathematical Physics

Scientific paper


It is demonstrated that the entropy of statistical mechanics and of information theory, $S({\bf p}) = -\sum_i p_i \log p_i$, may be viewed as a measure of correlation. Given a probability distribution $p_{ij}$ on two discrete variables, we define the correlation-destroying transformation $C: p_{ij} \to \pi_{ij}$, which creates a new distribution on those same variables in which no correlation exists between them, i.e. $\pi_{ij} = P_i Q_j$, where $P_i = \sum_j p_{ij}$ and $Q_j = \sum_i p_{ij}$ are the marginal distributions. It is then shown that the entropy obeys the relation $S({\bf p}) \leq S({\bf \pi}) = S({\bf P}) + S({\bf Q})$; i.e., the entropy is non-decreasing under these correlation-destroying transformations.
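The inequality in the abstract can be checked numerically. The following sketch (not from the paper; the joint distribution `p` is an illustrative choice) builds the marginals of a correlated two-variable distribution, applies the correlation-destroying transformation $\pi_{ij} = P_i Q_j$, and verifies both $S(\pi) = S(P) + S(Q)$ and $S(p) \leq S(\pi)$:

```python
import math

def entropy(probs):
    """Shannon entropy S(p) = -sum p_i log p_i (natural log; 0 log 0 := 0)."""
    return -sum(q * math.log(q) for q in probs if q > 0)

# An illustrative correlated joint distribution p_ij on two binary variables.
p = [[0.4, 0.1],
     [0.1, 0.4]]

# Marginals: P_i = sum_j p_ij and Q_j = sum_i p_ij.
P = [sum(row) for row in p]
Q = [sum(p[i][j] for i in range(2)) for j in range(2)]

# Correlation-destroying transformation C: p_ij -> pi_ij = P_i * Q_j.
pi = [[P[i] * Q[j] for j in range(2)] for i in range(2)]

S_p  = entropy(x for row in p  for x in row)   # S(p)
S_pi = entropy(x for row in pi for x in row)   # S(pi)

# Additivity for the uncorrelated distribution: S(pi) = S(P) + S(Q).
assert abs(S_pi - (entropy(P) + entropy(Q))) < 1e-12
# Entropy is non-decreasing under C: S(p) <= S(pi).
assert S_p <= S_pi
```

The second assertion is a special case of the subadditivity of Shannon entropy; equality holds exactly when the two variables are already independent.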


