Density Estimation and Classification via Bayesian Nonparametric Learning of Affine Subspaces
Statistics – Methodology
Scientific paper
2011-05-28
It is now practically the norm for data to be very high-dimensional in areas such as genetics, machine vision, and image analysis. When analyzing such data, parametric models are often too inflexible, while nonparametric procedures tend to be non-robust because of insufficient data in these high-dimensional spaces. With high-dimensional data, most of the variability often lies along a few directions, or more generally along a much lower-dimensional submanifold of the data space. In this article, we propose a class of models that flexibly learns this submanifold and its dimension, simultaneously performing dimension reduction. As a result, density estimation is carried out efficiently. When performing classification with a large predictor space, our approach allows the category probabilities to vary nonparametrically with a few features expressed as linear combinations of the predictors. Unlike many black-box methods for dimensionality reduction, the proposed model has clearly interpretable and identifiable parameters. Gibbs sampling methods are developed for posterior computation, and the approach is illustrated in simulated and real data applications.
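To make the geometric idea concrete, below is a minimal sketch in Python of the project-then-estimate workflow the abstract describes. It is not the paper's method: PCA stands in for the Bayesian nonparametric learning of the affine subspace and its dimension, and a finite Gaussian mixture stands in for the nonparametric density on the subspace coordinates; the synthetic data and all variable names are illustrative assumptions.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulate high-dimensional data whose variability lies near a
# 2-dimensional affine subspace embedded in 50 dimensions.
n, D, d = 500, 50, 2
latent = rng.normal(size=(n, d))                  # low-dimensional coordinates
basis = np.linalg.qr(rng.normal(size=(D, d)))[0]  # orthonormal basis of the subspace
offset = rng.normal(size=D)                       # affine offset
X = latent @ basis.T + offset + 0.05 * rng.normal(size=(n, D))  # small off-subspace noise

# Stand-in for the learned affine subspace: PCA estimates the offset
# (sample mean) and the directions capturing most of the variability.
pca = PCA(n_components=d)
coords = pca.fit_transform(X)  # coordinates within the estimated subspace

# Stand-in for the nonparametric density on the subspace coordinates:
# a finite Gaussian mixture in place of an infinite (Dirichlet process style) mixture.
gmm = GaussianMixture(n_components=5, random_state=0).fit(coords)

# Evaluate the log-density of a new observation by first projecting it
# onto the estimated subspace, then scoring it under the fitted mixture.
x_new = rng.normal(size=(1, D))
print(gmm.score_samples(pca.transform(x_new)))

In the paper itself the subspace, its dimension, and the density are all assigned priors and updated jointly by Gibbs sampling, which is what yields interpretable and identifiable parameters; the sketch above only mirrors the two-stage geometry of projecting and then estimating a density on the low-dimensional coordinates.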
Abhishek Bhattacharya
David Dunson
Garritt Page