PCA-Kernel Estimation
Mathematics – Statistics Theory
Scientific paper
2010-03-26
Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists of projecting the sample $\bX_1, \hdots, \bX_n$ onto the first $D$ eigenvectors of the Principal Component Analysis (PCA), associated with the empirical projector $\hat \Pi_D$. Classical nonparametric inference methods, such as kernel density estimation or kernel regression, are then performed in the (usually small) $D$-dimensional space. However, the mathematical analysis of this data-driven dimension reduction scheme raises technical problems, due to the fact that the random variables of the projected sample $(\hat \Pi_D\bX_1,\hdots, \hat \Pi_D\bX_n)$ are no longer independent. As a reference for further studies, we offer in this paper several results showing the asymptotic equivalence between important kernel-related quantities based on the empirical projector and their theoretical counterparts based on the true projector. As an illustration, we provide an in-depth analysis of the nonparametric kernel regression case.
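The two-step scheme described in the abstract (project onto the first $D$ empirical PCA eigenvectors, then run a kernel method in the projected space) can be sketched as follows. This is a minimal illustration in Python/NumPy, not the authors' implementation; the Gaussian kernel, the bandwidth value, and the synthetic data are assumptions made for the example.

```python
import numpy as np

def pca_projector(X, D):
    """Top-D eigenvectors of the empirical covariance (empirical projector)."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(X)
    _, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    V = eigvecs[:, ::-1][:, :D]        # keep the D leading eigenvectors
    return V                           # project a sample via X @ V

def nadaraya_watson(Z_train, y_train, Z_query, h):
    """Gaussian-kernel regression estimate at each query point."""
    d2 = ((Z_query[:, None, :] - Z_train[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * h ** 2))
    return (W @ y_train) / W.sum(axis=1)

# Synthetic high-dimensional sample (assumed for illustration only)
rng = np.random.default_rng(0)
n, p, D = 200, 20, 2
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# Step 1: data-driven dimension reduction with the empirical projector
V = pca_projector(X, D)
Z = X @ V                              # projected sample (no longer i.i.d.)

# Step 2: kernel regression in the D-dimensional projected space
y_hat = nadaraya_watson(Z, y, Z, h=0.5)
print(y_hat.shape)                     # (200,)
```

Note that the rows of `Z` depend on the full sample through `V`, which is exactly the loss of independence the paper analyzes.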
Gérard Biau
André Mas