Bounds on Non-Symmetric Divergence Measures in Terms of Symmetric Divergence Measures

Mathematics – Probability
Scientific paper
2005-06-13
20 pages
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler (1951) relative information and the Jeffreys (1951) J-divergence. The Jensen-Shannon divergence of Sibson (1969) has also found applications in the literature. The author (1995) studied a new divergence measure based on arithmetic and geometric means. Measures such as the harmonic mean divergence and the triangular discrimination are also known in the literature. Recently, Dragomir et al. (2001) studied a new measure similar to the J-divergence, called here the relative J-divergence. Another measure arising from the Jensen-Shannon divergence was studied by Lin (1991); here we call it the relative Jensen-Shannon divergence. The relative arithmetic-geometric divergence (Taneja, 2004) is also studied here. All these measures can be written as particular cases of Csiszár's f-divergence. By placing some conditions on the probability distributions, the aim here is to obtain bounds among the measures.
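For reference, the central notion can be recalled in its standard form (these are the usual textbook definitions for discrete distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n), not formulas quoted from the paper). Csiszár's f-divergence is

\[
C_f(P\|Q) = \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right), \qquad f \text{ convex on } (0,\infty),\ f(1)=0,
\]

and the named measures arise from particular choices of f: for example, $f(t) = t\ln t$ gives the Kullback-Leibler relative information $K(P\|Q) = \sum_i p_i \ln(p_i/q_i)$; $f(t) = (t-1)\ln t$ gives the Jeffreys J-divergence $J(P,Q) = \sum_i (p_i - q_i)\ln(p_i/q_i)$; and $f(t) = (t-1)^2/(t+1)$ gives the triangular discrimination $\Delta(P,Q) = \sum_i (p_i - q_i)^2/(p_i + q_i)$. The Jensen-Shannon divergence can be written as $I(P,Q) = \tfrac{1}{2}\big[K(P\| \tfrac{P+Q}{2}) + K(Q\| \tfrac{P+Q}{2})\big]$.

A minimal numerical sketch of some of these quantities, assuming two discrete distributions with strictly positive entries given as NumPy arrays (function names are illustrative, not the paper's notation):

import numpy as np

def kl(p, q):
    # Kullback-Leibler relative information K(P||Q)
    return float(np.sum(p * np.log(p / q)))

def j_divergence(p, q):
    # Jeffreys J-divergence: symmetrised Kullback-Leibler measure
    return kl(p, q) + kl(q, p)

def jensen_shannon(p, q):
    # Jensen-Shannon divergence: average KL to the midpoint distribution
    m = (p + q) / 2
    return 0.5 * (kl(p, m) + kl(q, m))

def triangular(p, q):
    # Triangular discrimination Delta(P,Q)
    return float(np.sum((p - q) ** 2 / (p + q)))

p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])
print(kl(p, q), j_divergence(p, q), jensen_shannon(p, q), triangular(p, q))

Comparing such values over many pairs (P, Q) illustrates numerically the kind of ordering among measures that the paper establishes analytically as bounds.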