Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models
Statistics – Machine Learning
Scientific paper
2008-10-06
Statistics
Machine Learning
34 pages, 6 figures, technical report (submitted)
Many problems of low-level computer vision and image processing, such as denoising, deconvolution, tomographic reconstruction or super-resolution, can be addressed by maximizing the posterior distribution of a sparse linear model (SLM). We show how higher-order Bayesian decision-making problems, such as optimizing image acquisition in magnetic resonance scanners, can be addressed by querying the SLM posterior covariance, which is unrelated to the density's mode. We propose a scalable algorithmic framework with which SLM posteriors over full, high-resolution images can be approximated for the first time, solving a variational optimization problem that is convex if and only if posterior mode finding is convex. These methods successfully drive the optimization of sampling trajectories for real-world magnetic resonance imaging through Bayesian experimental design, which has not been attempted before. Our methodology provides new insight into similarities and differences between sparse reconstruction and approximate Bayesian inference, and has important implications for compressive sensing of real-world images.
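For orientation, a sparse linear model of the kind referred to in the abstract is commonly written as follows; this is a standard textbook formulation, sketched here as an assumption, and the paper's exact notation and potentials may differ. With measurements y = Xu + noise and sparsity-inducing Laplace potentials on linear projections s = Bu, posterior mode finding (MAP reconstruction) is a convex problem, while Bayesian experimental design scores a candidate measurement matrix X_* by an information gain computed from the (approximate) posterior covariance:

\min_{u} \; \frac{1}{2\sigma^2}\,\| y - X u \|_2^2 \;+\; \tau\,\| B u \|_1,
\qquad
\Delta(X_*) \;=\; \tfrac{1}{2}\,\log\bigl| I + \sigma^{-2}\, X_* \,\mathrm{Cov}[u \mid y]\, X_*^{\top} \bigr|.

The first term is the Gaussian likelihood of the linear observations and the second the Laplace (L1) sparsity prior; the design score Delta(X_*) depends on the posterior covariance rather than on the mode, which is why approximate Bayesian inference, and not sparse reconstruction alone, is needed to drive acquisition optimization.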
Hannes Nickisch
Matthias W. Seeger
No associations