EP-GIG Priors and Applications in Bayesian Sparse Learning
Statistics – Machine Learning
Scientific paper
2012-04-19
33 pages, 10 figures
In this paper we propose a novel framework for the construction of sparsity-inducing priors. In particular, we define such priors as a mixture of exponential power distributions with a generalized inverse Gaussian mixing density (EP-GIG). EP-GIG is a variant of generalized hyperbolic distributions, and its special cases include Gaussian scale mixtures and Laplace scale mixtures. Furthermore, Laplace scale mixtures can subserve a Bayesian framework for sparse learning with nonconvex penalization. The densities of EP-GIG can be expressed explicitly, and the corresponding posterior distribution of the mixing variable also follows a generalized inverse Gaussian distribution. These properties lead us to EM algorithms for Bayesian sparse learning. We show that these algorithms bear an interesting resemblance to iteratively re-weighted $\ell_2$ or $\ell_1$ methods. In addition, we present two extensions for grouped variable selection and logistic regression.
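The following sketch illustrates how such a construction can yield the stated conjugacy; it assumes one common parameterization of the exponential power (EP) and generalized inverse Gaussian (GIG) densities, with symbols $b$, $\eta$, $q$, $\lambda$, $\chi$, $\psi$ chosen for exposition rather than taken from the paper:
$$
\mathrm{EP}(b \mid \eta, q) = \frac{q}{2\,\eta^{1/q}\,\Gamma(1/q)} \exp\!\Big(-\frac{|b|^{q}}{\eta}\Big), \qquad
\mathrm{GIG}(\eta \mid \lambda, \chi, \psi) \propto \eta^{\lambda-1} \exp\!\Big(-\tfrac{1}{2}\big(\tfrac{\chi}{\eta} + \psi\,\eta\big)\Big),
$$
$$
p(b) = \int_{0}^{\infty} \mathrm{EP}(b \mid \eta, q)\,\mathrm{GIG}(\eta \mid \lambda, \chi, \psi)\, d\eta, \qquad
p(\eta \mid b) = \mathrm{GIG}\big(\eta \mid \lambda - \tfrac{1}{q},\ \chi + 2|b|^{q},\ \psi\big).
$$
Under this conjugacy, an EM scheme for a linear model $y = Xb + \varepsilon$ would compute weights $w_j = \mathbb{E}[\eta_j^{-1} \mid b_j]$ in closed form in the E-step (a ratio of modified Bessel functions $K_\nu$), and in the M-step minimize $\tfrac{1}{2\sigma^2}\|y - Xb\|_2^2 + \sum_j w_j |b_j|^{q}$, which for $q=2$ and $q=1$ is an iteratively re-weighted $\ell_2$ or $\ell_1$ problem, consistent with the resemblance noted in the abstract.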
Michael I. Jordan
Dehua Liu
Shusen Wang
Zhihua Zhang