Shrinkage Algorithms for MMSE Covariance Estimation
Statistics – Methodology
Scientific paper
2009-07-27
We address covariance estimation in the sense of minimum mean-squared error (MMSE) for Gaussian samples. Specifically, we consider shrinkage methods, which are suitable for high-dimensional problems with a small number of samples (large p, small n). First, we improve on the Ledoit-Wolf (LW) method by conditioning on a sufficient statistic. By the Rao-Blackwell theorem, this yields a new estimator, called RBLW, whose mean-squared error dominates that of LW for Gaussian variables. Second, to further reduce the estimation error, we propose an iterative approach which approximates the clairvoyant shrinkage estimator. Convergence of this iterative method is established, and a closed-form expression for the limit is determined; we refer to the limit as the oracle approximating shrinkage (OAS) estimator. Both the RBLW and OAS estimators have simple expressions and are easily implemented. Although the two methods are developed from different perspectives, their structure is identical up to specified constants. The RBLW estimator provably dominates the LW method. Numerical simulations demonstrate that the OAS approach can perform even better than RBLW, especially when n is much less than p. We also demonstrate the performance of these techniques in the context of adaptive beamforming.
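The closed-form shrinkage coefficients are given in the paper itself; as a rough illustration of the general idea (shrinking the sample covariance toward a scaled identity target in the large-p, small-n regime), the sketch below fits shrinkage estimators to synthetic Gaussian data using scikit-learn's LedoitWolf and OAS classes, the latter of which cites this paper in its documentation. The dimensions, the ground-truth covariance, and the random seed are arbitrary choices for the example, not values from the paper.

```python
# Minimal sketch: compare the sample covariance with LW and OAS shrinkage
# estimators on synthetic Gaussian data where n is much smaller than p.
# Assumes scikit-learn is installed; dimensions below are illustrative only.
import numpy as np
from sklearn.covariance import LedoitWolf, OAS, empirical_covariance

rng = np.random.default_rng(0)
p, n = 100, 20                                  # dimension >> sample size
true_cov = np.diag(np.linspace(0.5, 2.0, p))    # arbitrary ground-truth covariance
X = rng.multivariate_normal(np.zeros(p), true_cov, size=n)

sample_cov = empirical_covariance(X, assume_centered=True)
lw = LedoitWolf(assume_centered=True).fit(X)
oas = OAS(assume_centered=True).fit(X)

def mse(est):
    """Mean squared entrywise error against the known ground truth."""
    return np.mean((est - true_cov) ** 2)

print("sample covariance MSE:", mse(sample_cov))
print("Ledoit-Wolf MSE:", mse(lw.covariance_), "shrinkage:", lw.shrinkage_)
print("OAS MSE:", mse(oas.covariance_), "shrinkage:", oas.shrinkage_)
```

In this small-sample setting the shrinkage estimators typically report a noticeably lower error than the raw sample covariance, with OAS choosing a somewhat different shrinkage intensity than LW, which is the qualitative behavior the abstract describes.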
Yilun Chen
Yonina C. Eldar
Alfred O. Hero III
Ami Wiesel