Computer Science – Information Theory
Scientific paper
2008-04-10
Submitted to the IEEE Transactions on Information Theory
Convexity is a key concept in information theory, notably via the many implications of Jensen's inequality, such as the non-negativity of the Kullback-Leibler divergence (KLD). Jensen's inequality also underlies the concept of the Jensen-Shannon divergence (JSD), which is a symmetrized and smoothed version of the KLD. This paper introduces new JSD-type divergences by extending the two building blocks of the JSD: convexity and Shannon's entropy. In particular, a new concept of q-convexity is introduced and shown to satisfy a Jensen q-inequality. Based on this Jensen q-inequality, the Jensen-Tsallis q-difference is built: a nonextensive generalization of the JSD based on Tsallis entropies. Finally, the Jensen-Tsallis q-difference is characterized in terms of convexity and extrema.
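To make the quantities named in the abstract concrete, here is a minimal NumPy sketch (not from the paper; function names and the uniform-weight default are illustrative) of Shannon entropy, Tsallis entropy, and the Jensen-Tsallis q-difference T_q(p_1, …, p_n) = S_q(Σ_i w_i p_i) − Σ_i w_i^q S_q(p_i), which reduces to the JSD at q = 1:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]  # convention: 0 log 0 = 0
    return -np.sum(nz * np.log(nz))

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1); S_1 = Shannon."""
    if q == 1.0:
        return shannon_entropy(p)
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def jensen_tsallis_q_difference(dists, q, weights=None):
    """T_q = S_q(mixture) - sum_i weights[i]**q * S_q(dists[i]).

    With uniform weights and q = 1 this is exactly the
    Jensen-Shannon divergence of the given distributions.
    """
    dists = [np.asarray(p, dtype=float) for p in dists]
    n = len(dists)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    mixture = sum(wi * pi for wi, pi in zip(w, dists))
    return tsallis_entropy(mixture, q) - sum(
        wi ** q * tsallis_entropy(pi, q) for wi, pi in zip(w, dists)
    )
```

Note that only at q = 1 does the q-difference of two identical distributions vanish; for q ≠ 1 the π_i^q weighting breaks this, which is one reason the paper studies its convexity and extrema rather than treating it as a divergence outright.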
Pedro Aguiar
Mario Figueiredo
Andre Martins
Nonextensive Generalizations of the Jensen-Shannon Divergence