PCA-Kernel Estimation

Mathematics – Statistics Theory

Scientific paper



Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists in projecting the sample $\bX_1, \hdots, \bX_n$ onto the first $D$ eigenvectors of the Principal Component Analysis (PCA), associated with the empirical projector $\hat \Pi_D$. Classical nonparametric inference methods, such as kernel density estimation or kernel regression, are then performed in the (usually small) $D$-dimensional space. However, the mathematical analysis of this data-driven dimension reduction scheme raises technical difficulties, because the random variables of the projected sample $(\hat \Pi_D\bX_1,\hdots, \hat \Pi_D\bX_n)$ are no longer independent. As a reference for further studies, this paper offers several results establishing the asymptotic equivalence between important kernel-related quantities based on the empirical projector and their theoretical counterparts. As an illustration, we provide an in-depth analysis of the nonparametric kernel regression case.
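As a rough numerical illustration of the projection-then-smoothing scheme described in the abstract, the sketch below first builds the empirical projector $\hat \Pi_D$ from the top $D$ PCA eigenvectors of a sample, then runs a kernel regression on the projected data. All data, dimensions, and the choice of a Gaussian Nadaraya-Watson estimator are hypothetical stand-ins, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic sample: n observations X_i in R^p with responses Y_i.
n, p, D = 200, 20, 2
X = rng.normal(size=(n, p))
Y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# Empirical projector \hat{\Pi}_D: span of the first D eigenvectors of the
# empirical covariance matrix (i.e., the PCA dimension reduction step).
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / n
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
V = eigvecs[:, ::-1][:, :D]              # top-D eigenvectors
Z = Xc @ V                               # projected sample \hat{\Pi}_D X_i

def nw_estimate(z0, Z, Y, h=0.5):
    """Nadaraya-Watson kernel regression estimate at z0 (Gaussian kernel,
    bandwidth h). Performed in the reduced D-dimensional space."""
    w = np.exp(-np.sum((Z - z0) ** 2, axis=1) / (2.0 * h ** 2))
    return np.sum(w * Y) / np.sum(w)

yhat = nw_estimate(Z[0], Z, Y)
```

Note that the projected points $Z_i = \hat \Pi_D \bX_i$ all depend on the full sample through the estimated eigenvectors, which is exactly the dependence issue the paper analyzes.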


