Mathematics – Statistics Theory
Scientific paper
2012-01-02
7 pages, 1 figure
We introduce a new entropy-based measure, the bounded Bhattacharyya distance (BBD), for quantifying the dissimilarity between probability distributions. BBD is based on the Bhattacharyya coefficient (fidelity), and is symmetric, positive semi-definite, and bounded. Unlike the Kullback-Leibler divergence, BBD does not require the probability density functions to be absolutely continuous with respect to each other. We show that BBD belongs to the class of Csiszár f-divergences and derive relationships between BBD and well-known measures such as the Bhattacharyya, Hellinger, and Jensen-Shannon divergences. Bounds on the Bayesian error probability are established using the BBD measure. We show that the curvature of BBD in the parameter space of families of distributions is proportional to the Fisher information. For distributions with vector-valued parameters, the curvature matrix can be used to obtain the Rao geodesic distance between them. We also discuss a potential application of probability distance measures in model selection.
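The exact functional form of BBD is given in the paper itself; as a minimal sketch, the underlying Bhattacharyya coefficient (fidelity) and the related Hellinger distance mentioned in the abstract can be computed for discrete distributions as follows (function names are illustrative, not from the paper):

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Fidelity rho = sum_i sqrt(p_i * q_i) for discrete distributions p, q.

    rho lies in [0, 1]: rho = 1 iff p == q, and rho = 0 iff the
    supports of p and q are disjoint.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

def hellinger_distance(p, q):
    """Hellinger distance H = sqrt(1 - rho), bounded in [0, 1]."""
    return float(np.sqrt(1.0 - bhattacharyya_coefficient(p, q)))

# Example: identical distributions give rho = 1, H = 0;
# disjoint supports give rho = 0, H = 1.
print(bhattacharyya_coefficient([0.5, 0.5], [0.5, 0.5]))  # 1.0
print(hellinger_distance([1.0, 0.0], [0.0, 1.0]))         # 1.0
```

BBD itself is a bounded, symmetric function of the same coefficient rho, so any such measure can be evaluated once rho is available.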
Jolad Shivakumar
Roman Ahmed
Shastry Mahesh C.