Nonextensive Generalizations of the Jensen-Shannon Divergence

Computer Science – Information Theory

Scientific paper

Details

Submitted to the IEEE Transactions on Information Theory

Convexity is a key concept in information theory, notably via the many implications of Jensen's inequality, such as the non-negativity of the Kullback-Leibler divergence (KLD). Jensen's inequality also underlies the concept of the Jensen-Shannon divergence (JSD), a symmetrized and smoothed version of the KLD. This paper introduces new JSD-type divergences by extending the JSD's two building blocks: convexity and Shannon's entropy. In particular, a new concept of q-convexity is introduced and shown to satisfy a Jensen q-inequality. Based on this inequality, the Jensen-Tsallis q-difference is constructed: a nonextensive generalization of the JSD built on Tsallis entropies. Finally, the Jensen-Tsallis q-difference is characterized in terms of convexity and extrema.
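To make the abstract's central object concrete, below is a minimal Python sketch of the Jensen-Tsallis q-difference, under two assumptions not spelled out in the abstract: the standard Tsallis entropy S_q(p) = (1 - sum_i p_i^q)/(q - 1), which recovers Shannon entropy as q -> 1, and the weighted form T_q(p1, p2) = S_q(w1*p1 + w2*p2) - (w1^q S_q(p1) + w2^q S_q(p2)). Function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), with 0*log(0) := 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1).

    The q -> 1 limit is Shannon entropy, handled here as a special case.
    """
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def jensen_tsallis_q_difference(p1, p2, q, w=(0.5, 0.5)):
    """Assumed form of the Jensen-Tsallis q-difference:

        T_q(p1, p2) = S_q(w1*p1 + w2*p2) - (w1**q * S_q(p1) + w2**q * S_q(p2))

    Note the weights on the entropy terms are raised to the power q;
    for q = 1 this reduces to the weighted Jensen-Shannon divergence.
    """
    p1, p2 = np.asarray(p1, dtype=float), np.asarray(p2, dtype=float)
    w1, w2 = w
    mixture = w1 * p1 + w2 * p2
    return tsallis_entropy(mixture, q) - (
        w1 ** q * tsallis_entropy(p1, q) + w2 ** q * tsallis_entropy(p2, q)
    )

p = [0.1, 0.4, 0.5]
r = [0.6, 0.3, 0.1]
print(jensen_tsallis_q_difference(p, r, q=1.0))  # classical JSD(p, r)
print(jensen_tsallis_q_difference(p, r, q=2.0))  # nonextensive case, q = 2
```

At q = 1 the weights w_j^q reduce to w_j and the Tsallis entropy to Shannon entropy, so the q-difference collapses to the classical (weighted) JSD, consistent with the abstract's claim that it generalizes the JSD.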
