Mathematics – Probability
Scientific paper
2005-05-11
29 pages. To appear in: Inequality Theory and Applications, Vol. 4 (2004)
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's (1951) relative information and Jeffreys' (1946) J-divergence. The information radius, or Jensen difference divergence measure, due to Sibson (1969), is also known in the literature, and Burbea and Rao (1982) found further applications for it. Taneja (1995) studied another kind of divergence measure based on the arithmetic and geometric means. These three divergence measures bear a good relationship with one another. There are also further measures arising from the J-divergence, JS-divergence, and AG-divergence; we call these relative divergence measures, or non-symmetric divergence measures. Our aim here is to obtain bounds on symmetric and non-symmetric divergence measures in terms of the relative information of type s, using properties of Csiszár's f-divergence.
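To make the measures named above concrete, the following is a minimal sketch of the three symmetric divergences the abstract refers to, using their standard textbook definitions (natural logarithm, finite discrete distributions); the function names here are illustrative, not taken from the paper:

```python
import math

def kl(p, q):
    """Kullback-Leibler relative information: K(P||Q) = sum p_i * log(p_i / q_i).
    Terms with p_i = 0 contribute 0 by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def j_divergence(p, q):
    """Jeffreys' J-divergence: the symmetrized KL, J(P,Q) = K(P||Q) + K(Q||P)."""
    return kl(p, q) + kl(q, p)

def js_divergence(p, q):
    """Jensen-Shannon divergence (Sibson's information radius):
    the average KL distance to the midpoint M = (P + Q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * (kl(p, m) + kl(q, m))

def ag_divergence(p, q):
    """Arithmetic-geometric mean divergence (Taneja-type):
    sum over i of ((p_i + q_i)/2) * log( ((p_i + q_i)/2) / sqrt(p_i * q_i) ),
    i.e. the KL distance from the arithmetic to the geometric mean."""
    return sum(((pi + qi) / 2) * math.log(((pi + qi) / 2) / math.sqrt(pi * qi))
               for pi, qi in zip(p, q) if pi > 0 and qi > 0)
```

Each of these can be written as a Csiszár f-divergence, sum of q_i * f(p_i / q_i) for a suitable convex f with f(1) = 0 (e.g. f(t) = t log t recovers KL), which is the structure the paper's inequalities exploit.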