Computer Science – Learning
Scientific paper
2012-02-06
Submitted to the IEEE Transactions on Signal Processing, 30 pages, 13 figures
In this paper, we derive Hybrid, Bayesian, and Marginalized Cramér-Rao Lower Bounds (HCRB, BCRB, and MCRB) for the single measurement vector and multiple measurement vector Sparse Bayesian Learning (SBL) problem of estimating compressible vectors and their prior distribution parameters. We assume that the unknown vector is drawn from a compressible Student-t prior distribution. We derive CRBs that account for the deterministic or random nature of the unknown parameters of the prior distribution and the regression noise variance. We extend the MCRB to the case where the compressible vector is distributed according to a general compressible prior distribution, of which the generalized Pareto distribution is a special case. We use the derived bounds to uncover the relationship between the compressibility and the Mean Square Error (MSE) of the estimates. Further, we illustrate the tightness and utility of the bounds through simulations, by comparing them with the MSE performance of two popular SBL-based estimators. We find that the MCRB is generally the tightest among the derived bounds, and that the MSE performance of the Expectation-Maximization (EM) algorithm coincides with the MCRB for the compressible vector. Through simulations, we demonstrate the dependence of the lower bounds, as well as the MSE performance of SBL-based estimators, on the compressibility of the vector for several values of the number of observations and measurements, and at different signal powers.
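For context, the following is a minimal LaTeX sketch of the generic hybrid Cramér-Rao inequality and of a Student-t prior written as a Gaussian scale mixture, as commonly used in SBL; the notation (theta, x, gamma, a, b) is illustrative and is not taken from the paper itself.

% Hybrid Cramér-Rao inequality for a parameter vector \theta that stacks the
% random compressible vector x and the (deterministic or random) prior and
% noise parameters; H is the hybrid information matrix.
\[
  \mathbb{E}\big[(\hat{\theta}-\theta)(\hat{\theta}-\theta)^{T}\big] \succeq H^{-1},
  \qquad
  H = \mathbb{E}_{\mathbf{y},\theta}\big[-\nabla_{\theta}\nabla_{\theta}^{T}\log p(\mathbf{y},\theta)\big].
\]
% A compressible Student-t prior on each entry x_i arises as a Gaussian scale
% mixture with a Gamma hyperprior on the precision \gamma_i:
\[
  p(x_i) = \int_{0}^{\infty} \mathcal{N}\big(x_i;\,0,\,\gamma_i^{-1}\big)\,
           \mathrm{Gamma}(\gamma_i;\,a,\,b)\,\mathrm{d}\gamma_i ,
\]
% which integrates to a Student-t density whose tail heaviness (and hence the
% compressibility of x) is governed by the shape parameter a.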
Chandra R. Murthy
Ranjitha Prasad