Bounds On Triangular Discrimination, Harmonic Mean and Symmetric Chi-square Divergences

Mathematics – Probability

Scientific paper


Details

To appear in: Journal of Concrete and Applicable Mathematics, 2005


Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler relative information and Jeffreys' J-divergence. Measures such as the Bhattacharyya distance, Hellinger discrimination, chi-square divergence, triangular discrimination, and harmonic mean divergence are also well known in the statistics literature. In this paper we obtain bounds on triangular discrimination and symmetric chi-square divergence in terms of relative information of type s, using Csiszár's f-divergence. A relationship between triangular discrimination and harmonic mean divergence is also given.
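The measures named in the abstract can be illustrated numerically. The sketch below uses the standard definitions of triangular discrimination Δ(P,Q) = Σ(pᵢ−qᵢ)²/(pᵢ+qᵢ), harmonic mean divergence M(P,Q) = Σ 2pᵢqᵢ/(pᵢ+qᵢ), and symmetric chi-square divergence Ψ(P,Q) = Σ(pᵢ−qᵢ)²(pᵢ+qᵢ)/(pᵢqᵢ); the paper's own definitions may differ in normalization, so this is an assumed-convention example, not the authors' exact formulation. The example distributions `p` and `q` are hypothetical.

```python
import numpy as np

def triangular(p, q):
    # Triangular discrimination: Delta(P,Q) = sum (p - q)^2 / (p + q)
    return np.sum((p - q) ** 2 / (p + q))

def harmonic_mean(p, q):
    # Harmonic mean divergence: M(P,Q) = sum 2 p q / (p + q)
    return np.sum(2 * p * q / (p + q))

def symmetric_chi_square(p, q):
    # Psi(P,Q) = chi^2(P||Q) + chi^2(Q||P) = sum (p - q)^2 (p + q) / (p q)
    return np.sum((p - q) ** 2 * (p + q) / (p * q))

# Two hypothetical discrete probability distributions on 3 points
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# One standard identity linking the two measures: Delta = 2 (1 - M),
# since (p - q)^2 = (p + q)^2 - 4 p q and both distributions sum to 1.
print(triangular(p, q), 2 * (1 - harmonic_mean(p, q)))
print(symmetric_chi_square(p, q))
```

The identity Δ(P,Q) = 2(1 − M(P,Q)) is one elementary form the abstract's "relationship among triangular discrimination and harmonic mean divergence" can take under these definitions.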



Profile ID: LFWR-SCP-O-163079
