EP-GIG Priors and Applications in Bayesian Sparse Learning

Statistics – Machine Learning

Scientific paper

Details

33 pages, 10 figures

In this paper we propose a novel framework for constructing sparsity-inducing priors. In particular, we define such priors as a mixture of exponential power distributions with a generalized inverse Gaussian mixing density (EP-GIG). EP-GIG is a variant of the generalized hyperbolic distributions, and its special cases include Gaussian scale mixtures and Laplace scale mixtures. Furthermore, Laplace scale mixtures can serve as a Bayesian framework for sparse learning with nonconvex penalization. The densities of EP-GIG can be expressed in closed form. Moreover, the conditional posterior of the mixing variable also follows a generalized inverse Gaussian distribution. These properties lead us to EM algorithms for Bayesian sparse learning. We show that these algorithms bear an interesting resemblance to iteratively re-weighted $\ell_2$ or $\ell_1$ methods. In addition, we present two extensions for grouped variable selection and logistic regression.
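
As a rough sketch of the construction summarized above, assuming one standard parametrization of the exponential power (EP) and generalized inverse Gaussian (GIG) densities (the paper's own notation and parametrization may differ; the symbols $q$, $\lambda$, $\psi$, $\chi$ below are illustrative assumptions, not the authors' definitions):

$$\mathrm{EP}(b \mid 0, \eta, q) = \frac{q}{2\,\eta^{1/q}\,\Gamma(1/q)} \exp\!\Big(-\frac{|b|^{q}}{\eta}\Big), \qquad \mathrm{GIG}(\eta \mid \lambda, \psi, \chi) = \frac{(\psi/\chi)^{\lambda/2}}{2 K_{\lambda}(\sqrt{\psi\chi})}\, \eta^{\lambda-1} \exp\!\Big(-\tfrac{1}{2}\big(\psi\eta + \tfrac{\chi}{\eta}\big)\Big), \quad \eta > 0,$$

$$p(b) = \int_{0}^{\infty} \mathrm{EP}(b \mid 0, \eta, q)\, \mathrm{GIG}(\eta \mid \lambda, \psi, \chi)\, d\eta, \qquad p(\eta \mid b) \propto \eta^{\lambda - \frac{1}{q} - 1} \exp\!\Big(-\tfrac{1}{2}\big(\psi\eta + \tfrac{\chi + 2|b|^{q}}{\eta}\big)\Big) = \mathrm{GIG}\big(\eta \mid \lambda - \tfrac{1}{q},\, \psi,\, \chi + 2|b|^{q}\big).$$

Under this parametrization, $q = 2$ recovers Gaussian scale mixtures and $q = 1$ Laplace scale mixtures, and the fact that $p(\eta \mid b)$ stays in the GIG family is what yields a closed-form E-step and, in turn, the iteratively re-weighted $\ell_2$/$\ell_1$ character of the resulting EM updates.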
