Compressible Distributions for High-dimensional Statistics
Mathematics – Statistics Theory
Scientific paper
2011-02-07
Previously entitled "Compressible priors for high-dimensional statistics"; published in IEEE Transactions on Information Theory (2012)
DOI: 10.1109/TIT.2012.2197174
We develop a principled way of identifying probability distributions whose independent and identically distributed (iid) realizations are compressible, i.e., can be well-approximated as sparse. We focus on Gaussian random underdetermined linear regression (GULR) problems, where compressibility is known to ensure the success of estimators exploiting sparse regularization. We prove that many distributions revolving around the maximum a posteriori (MAP) interpretation of sparse regularized estimators are in fact incompressible, in the limit of large problem sizes. A highlight is the Laplace distribution and $\ell^{1}$-regularized estimators such as the Lasso and Basis Pursuit denoising. To establish this result, we identify non-trivial undersampling regions in GULR where the simple least squares solution almost surely outperforms an oracle sparse solution, when the data is generated from the Laplace distribution. We provide simple rules of thumb to characterize classes of compressible (respectively, incompressible) distributions based on their second and fourth moments. Generalized Gaussians and generalized Pareto distributions serve as running examples for concreteness.
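As a rough illustration of the notion of compressibility discussed in the abstract (not taken from the paper itself), the following minimal Python sketch compares the relative best k-term approximation error of iid Laplace draws with iid draws from a heavy-tailed Pareto-type distribution. The function name relative_kterm_error, the sample size, the 10% sparsity level, and the shape parameter are illustrative assumptions, not choices made by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_kterm_error(x, k):
    # Relative l2 error of the best k-term approximation of x:
    # keep the k largest-magnitude entries, discard the rest.
    mag = np.sort(np.abs(x))[::-1]          # magnitudes, largest first
    tail_energy = np.sum(mag[k:] ** 2)      # energy outside the k largest entries
    total_energy = np.sum(mag ** 2)
    return np.sqrt(tail_energy / total_energy)

N = 100_000
k = N // 10   # keep the largest 10% of the entries

# iid Laplace samples: the paper argues such vectors are not compressible
# in the large-N limit.
laplace = rng.laplace(size=N)

# iid heavy-tailed samples (Lomax / Pareto II, i.e. a generalized Pareto
# shifted to start at 0): a stand-in for the paper's compressible running
# example; the shape parameter 1.5 is an arbitrary illustrative choice.
pareto = rng.pareto(1.5, size=N)

print("Laplace relative 10%-term error:", relative_kterm_error(laplace, k))
print("Pareto  relative 10%-term error:", relative_kterm_error(pareto, k))
```

Under the paper's result one would expect the Laplace vector to keep a non-negligible fraction of its energy outside its largest 10% of entries, whereas the heavy-tailed vector should concentrate most of its energy in a few large entries, i.e., be well-approximated as sparse.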
Volkan Cevher
Mike E. Davies
Rémi Gribonval