Approximate Maximum A Posteriori Inference with Entropic Priors

Computer Science – Sound

Scientific paper

In certain applications it is useful to fit multinomial distributions to observed data with a penalty term that encourages sparsity. For example, in probabilistic latent audio source decomposition one may wish to encode the assumption that only a few latent sources are active at any given time. The standard heuristic of applying an L1 penalty is not an option when fitting the parameters of a multinomial distribution, since those parameters are constrained to sum to 1, which makes the L1 norm constant. An alternative is to use a penalty term that encourages low-entropy solutions, which corresponds to maximum a posteriori (MAP) parameter estimation with an entropic prior. The lack of conjugacy between the entropic prior and the multinomial distribution complicates this approach. In this report I propose a simple iterative algorithm for MAP estimation of multinomial distributions with sparsity-inducing entropic priors.
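To make the setup concrete: with observed counts c_i and an entropic prior p(theta) ∝ exp(-beta * H(theta)), the MAP objective is to maximize sum_i c_i log(theta_i) + beta * sum_i theta_i log(theta_i) subject to sum_i theta_i = 1, where beta > 0 rewards low-entropy (sparse) solutions. The sketch below is an illustrative assumption, not the algorithm the report itself proposes: it optimizes this objective with a generic exponentiated-gradient (multiplicative) update followed by renormalization onto the simplex. The function name and step-size choices are hypothetical.

```python
import numpy as np

def map_multinomial_entropic(counts, beta=1.0, lr=0.01, n_iter=10000):
    """Illustrative sketch (not the report's algorithm): approximate MAP
    estimate of multinomial parameters theta under an entropic prior
    p(theta) ~ exp(-beta * H(theta)), i.e. maximize
        L(theta) = sum_i c_i log(theta_i) + beta * sum_i theta_i log(theta_i)
    subject to sum_i theta_i = 1, via exponentiated-gradient ascent."""
    counts = np.asarray(counts, dtype=float)
    # Start near the maximum-likelihood estimate, floored away from zero.
    theta = np.clip(counts / counts.sum(), 1e-10, None)
    theta /= theta.sum()
    for _ in range(n_iter):
        # Gradient of the (unconstrained) log-posterior.
        grad = counts / theta + beta * (np.log(theta) + 1.0)
        # Multiplicative update keeps theta positive; work in log space
        # and subtract the max for numerical stability.
        log_theta = np.log(theta) + lr * grad
        log_theta -= log_theta.max()
        theta = np.exp(log_theta)
        # Renormalize back onto the probability simplex.
        theta = np.clip(theta / theta.sum(), 1e-10, None)
        theta /= theta.sum()
    return theta
```

With beta = 0 the fixed point is the maximum-likelihood estimate; increasing beta shifts mass toward the dominant components, driving small components toward zero and yielding the lower-entropy solutions the abstract describes.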


Profile ID: LFWR-SCP-O-522368
