Computer Science – Information Theory
Scientific paper
2010-10-03
Three classical divergence measures exist in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler J-divergence, the Sibson-Burbea-Rao Jensen-Shannon divergence, and the Taneja arithmetic-geometric mean divergence. These three measures bear an interesting relationship with one another and are based on logarithmic expressions. Divergence measures not based on logarithmic expressions, such as the Hellinger discrimination, the symmetric chi-square divergence, and the triangular discrimination, are also known in the literature. In past years, Dragomir et al., Kumar and Johnson, and Jain and Srivastava have studied different kinds of divergence measures. In this paper, we present some new divergence measures, obtain inequalities relating these new measures, and connect them with previously known ones. The idea of exponential divergence is also introduced.
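The measures named in the abstract all have standard textbook definitions, and Taneja's earlier work established a known inequality chain relating them. As a minimal sketch (the function names and the example distributions are mine, not from the paper), the six classical measures and the chain Δ/4 ≤ I ≤ h ≤ J/8 ≤ T ≤ Ψ/16 can be checked numerically:

```python
# Sketch of the classical divergence measures named in the abstract, for
# discrete distributions P, Q with strictly positive entries. Function
# names and the example distributions are illustrative, not the paper's.
import math

def j_divergence(p, q):
    """Jeffreys-Kullback-Leibler J-divergence: sum (p-q) ln(p/q)."""
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

def js_divergence(p, q):
    """Sibson-Burbea-Rao Jensen-Shannon divergence I(P,Q)."""
    return 0.5 * sum(pi * math.log(2 * pi / (pi + qi)) +
                     qi * math.log(2 * qi / (pi + qi))
                     for pi, qi in zip(p, q))

def agm_divergence(p, q):
    """Taneja arithmetic-geometric mean divergence T(P,Q)."""
    return sum(((pi + qi) / 2) * math.log((pi + qi) / (2 * math.sqrt(pi * qi)))
               for pi, qi in zip(p, q))

def hellinger(p, q):
    """Hellinger discrimination: (1/2) sum (sqrt(p) - sqrt(q))^2."""
    return 0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                     for pi, qi in zip(p, q))

def sym_chi_square(p, q):
    """Symmetric chi-square divergence: sum (p-q)^2 (p+q) / (p q)."""
    return sum((pi - qi) ** 2 * (pi + qi) / (pi * qi) for pi, qi in zip(p, q))

def triangular(p, q):
    """Triangular discrimination: sum (p-q)^2 / (p+q)."""
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

# Taneja's known chain: Delta/4 <= I <= h <= J/8 <= T <= Psi/16
P, Q = [0.5, 0.5], [0.7, 0.3]  # example distributions, chosen arbitrarily
chain = [triangular(P, Q) / 4, js_divergence(P, Q), hellinger(P, Q),
         j_divergence(P, Q) / 8, agm_divergence(P, Q), sym_chi_square(P, Q) / 16]
assert all(a <= b for a, b in zip(chain, chain[1:]))
```

The logarithmic measures (J, I, T) and the non-logarithmic ones (h, Ψ, Δ) interleave in the chain, which is the kind of relationship the paper's new inequalities extend.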
Sequences of Inequalities Among New Divergence Measures