Computer Science – Information Theory
Scientific paper
2011-04-29
In this paper we consider two one-parameter generalizations of symmetric divergence measures. These generalizations contain as particular cases the well-known measures J-divergence, Jensen-Shannon divergence, and arithmetic-geometric mean divergence; these three measures involve logarithmic expressions. As further particular cases we obtain the Hellinger discrimination, the symmetric chi-square divergence, and the triangular discrimination, three measures also well known in the statistics literature that do not involve logarithmic expressions. One more non-logarithmic measure arises as a particular case, which we call the d-divergence. These seven measures satisfy an interesting inequality. Based on this inequality, we consider differences of divergence measures and establish a sequence of inequalities among them.
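The abstract names six standard symmetric divergence measures (the paper's d-divergence is not defined in this summary, so it is omitted here). As a sketch, assuming the textbook definitions of these measures for finite discrete distributions with strictly positive entries, they can be computed as:

```python
import math

def j_divergence(p, q):
    """J-divergence (symmetrized Kullback-Leibler): sum (p-q) ln(p/q)."""
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence (natural logarithm)."""
    kl = lambda a, b: sum(ai * math.log(ai / bi) for ai, bi in zip(a, b))
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def ag_mean_divergence(p, q):
    """Arithmetic-geometric mean divergence: sum ((p+q)/2) ln((p+q)/(2 sqrt(p q)))."""
    return sum(((pi + qi) / 2) * math.log((pi + qi) / (2 * math.sqrt(pi * qi)))
               for pi, qi in zip(p, q))

def hellinger(p, q):
    """Hellinger discrimination: (1/2) sum (sqrt(p) - sqrt(q))^2."""
    return 0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))

def symmetric_chi_square(p, q):
    """Symmetric chi-square divergence: sum (p-q)^2 (p+q) / (p q)."""
    return sum((pi - qi) ** 2 * (pi + qi) / (pi * qi) for pi, qi in zip(p, q))

def triangular(p, q):
    """Triangular discrimination: sum (p-q)^2 / (p+q)."""
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

# Example: two distributions on a three-point alphabet.
P = [0.4, 0.4, 0.2]
Q = [0.2, 0.5, 0.3]
for name, f in [("J", j_divergence), ("I", jensen_shannon),
                ("T", ag_mean_divergence), ("h", hellinger),
                ("Psi", symmetric_chi_square), ("Delta", triangular)]:
    print(f"{name}(P,Q) = {f(P, Q):.6f}")
```

All six functions are symmetric in their arguments and vanish when the two distributions coincide; the specific inequality chain relating them (and its constants) is the subject of the paper itself and is not reproduced here.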
A Sequence of Inequalities among Difference of Symmetric Divergence Measures