Finding the Maximizers of the Information Divergence from an Exponential Family

Computer Science – Information Theory

Scientific paper

Details

25 pages

This paper investigates maximizers of the information divergence from an exponential family $E$. It is shown that the $rI$-projection of a maximizer $P$ to $E$ is a convex combination of $P$ and a probability measure $P_-$ with disjoint support and the same value of the sufficient statistics $A$. This observation can be used to transform the original problem of maximizing $D(\cdot||E)$ over the set of all probability measures into the maximization of a function $\overline{D}$ over a convex subset of $\ker A$. The global maximizers of both problems correspond to each other. Furthermore, finding all local maximizers of $\overline{D}$ yields all local maximizers of $D(\cdot||E)$. This paper also proposes two algorithms to find the maximizers of $\overline{D}$ and applies them to two examples where the maximizers of $D(\cdot||E)$ were not known before.

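As a small illustration of the decomposition result (this worked example is added for orientation only and is not necessarily one of the two examples treated in the paper), take $E$ to be the independence model of two binary variables, so that the sufficient statistics $A$ record the two marginal distributions; for this family, $D(\cdot||E)$ coincides with the mutual information. The distribution $P$ with $P(0,0)=P(1,1)=\tfrac{1}{2}$ is a global maximizer, with $D(P||E)=\log 2$. Writing $\Pi_E(P)$ for its $rI$-projection to $E$ (notation introduced here, not in the abstract), this projection is the product of the marginals of $P$, i.e. the uniform distribution on the four states, and it decomposes exactly as the theorem describes:

$$\Pi_E(P) \;=\; \tfrac{1}{2}\,P + \tfrac{1}{2}\,P_-, \qquad P_-(0,1)=P_-(1,0)=\tfrac{1}{2},$$

where $P_-$ has support disjoint from that of $P$ and the same marginals, i.e. the same value of $A$.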