Deformed Statistics Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework

Physics – Condensed Matter – Statistical Mechanics

Scientific paper

Details

16 pages; iterative corrections and expansions.

The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics, constrained by the additive duality of generalized statistics (the dual generalized K-Ld), is here reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure-theoretic framework. Specifically, the dual generalized K-Ld is demonstrated to be a scaled Bregman divergence. The Pythagorean theorem is derived from the minimum discrimination information principle, using the dual generalized K-Ld as the measure of uncertainty, with constraints defined by normal averages. The minimization of the dual generalized K-Ld under normal-averages constraints is shown to exhibit distinctive features.
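For orientation, the following LaTeX sketch records one common set of conventions for the objects named in the abstract: the q-deformed logarithm, the generalized (Tsallis) K-Ld, the additive duality q* = 2 - q, and the scaled Bregman divergence in the sense of Stummer and Vajda. These are standard textbook forms, not necessarily the paper's exact normalizations.

% A sketch of common conventions; the paper's definitions may differ.
% q-deformed logarithm, recovering ln(x) as q -> 1:
\[
  \ln_q x \;=\; \frac{x^{1-q} - 1}{1 - q}, \qquad q \neq 1 .
\]
% Generalized (Tsallis) Kullback-Leibler divergence between densities p and r:
\[
  D_{\mathrm{K\text{-}L}}^{(q)}\!\left[p \,\|\, r\right]
  \;=\; \int p(x)\, \ln_q \frac{p(x)}{r(x)}\, \mathrm{d}x .
\]
% Additive duality of generalized statistics: q* = 2 - q, under which
% the deformed logarithm satisfies ln_{q*}(x) = -ln_q(1/x); the
% "dual generalized K-Ld" is the q* instance of the divergence above.
\[
  q^{*} = 2 - q, \qquad \ln_{q^{*}} x \;=\; -\ln_q \frac{1}{x} .
\]
% Scaled Bregman divergence (Stummer & Vajda): for a convex generator phi
% and a scaling measure M with density m,
\[
  B_{\phi}\!\left(P, R \mid M\right)
  \;=\; \int \left[ \phi\!\left(\tfrac{p}{m}\right) - \phi\!\left(\tfrac{r}{m}\right)
        - \phi'\!\left(\tfrac{r}{m}\right)\!\left(\tfrac{p}{m} - \tfrac{r}{m}\right) \right] m(x)\,\mathrm{d}x .
\]

In this vocabulary, "normal averages" refers to ordinary linear expectations, ⟨A⟩ = ∫ p(x) A(x) dx, as opposed to the escort (q-weighted) averages often employed in Tsallis statistics; the Pythagorean-type result in the abstract concerns minimizers of the dual generalized K-Ld subject to such constraints.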
