Mathematics – Probability
Scientific paper
2005-05-12
To appear in: Journal of Concrete and Applicable Mathematics, 2005
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler relative information and Jeffreys' J-divergence. Measures such as the Bhattacharyya distance, Hellinger discrimination, chi-square divergence, triangular discrimination, and harmonic mean divergence are also well known in the statistics literature. In this paper we obtain bounds on the triangular discrimination and the symmetric chi-square divergence in terms of the relative information of type s, using Csiszár's f-divergence. A relationship between the triangular discrimination and the harmonic mean divergence is also given.
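The abstract does not reproduce the paper's formulas, so the sketch below uses the standard textbook definitions of the measures it names: triangular discrimination Delta, harmonic mean divergence W, and symmetric chi-square divergence Psi. The function names and test distributions are illustrative assumptions, not taken from the paper. With these definitions, two elementary relations can be checked directly: the identity Delta(P,Q) = 2(1 - W(P,Q)), which follows algebraically from (p - q)^2 = (p + q)^2 - 4pq, and the bound Delta(P,Q) <= Psi(P,Q)/4, since Psi/4 - Delta = sum (p - q)^4 / (4pq(p + q)) >= 0. Whether these coincide with the paper's stated results is an assumption.

    # Sketch: standard definitions of divergence measures named in the abstract,
    # plus a numerical check of two elementary relations among them.
    # Function names and test distributions are illustrative, not from the paper.

    def triangular(p, q):
        # Triangular discrimination: Delta(P,Q) = sum (p_i - q_i)^2 / (p_i + q_i)
        return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

    def harmonic_mean(p, q):
        # Harmonic mean divergence: W(P,Q) = sum 2 p_i q_i / (p_i + q_i)
        return sum(2 * pi * qi / (pi + qi) for pi, qi in zip(p, q))

    def symmetric_chi_square(p, q):
        # Symmetric chi-square divergence:
        # Psi(P,Q) = chi^2(P,Q) + chi^2(Q,P)
        #          = sum (p_i - q_i)^2 (p_i + q_i) / (p_i q_i)
        return sum((pi - qi) ** 2 * (pi + qi) / (pi * qi) for pi, qi in zip(p, q))

    P = [0.5, 0.3, 0.2]
    Q = [0.2, 0.4, 0.4]

    delta = triangular(P, Q)
    w = harmonic_mean(P, Q)
    psi = symmetric_chi_square(P, Q)

    # Identity: Delta = 2 (1 - W), since (p - q)^2 = (p + q)^2 - 4pq.
    assert abs(delta - 2 * (1 - w)) < 1e-12

    # Bound: Delta <= Psi / 4, since Psi/4 - Delta = sum (p-q)^4 / (4pq(p+q)) >= 0.
    assert delta <= psi / 4

    print(f"Delta = {delta:.6f}, W = {w:.6f}, Psi = {psi:.6f}")

The paper's actual bounds are stated in terms of the relative information of type s, whose precise form and constants are not given in the abstract, so only these two definition-level relations are verified here.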