Sufficient Component Analysis for Supervised Dimension Reduction
Scientific paper
Statistics – Machine Learning
2011-03-25
The purpose of sufficient dimension reduction (SDR) is to find the low-dimensional subspace of input features that is sufficient for predicting output values. In this paper, we propose a novel distribution-free SDR method called sufficient component analysis (SCA), which is computationally more efficient than existing methods. In our method, a solution is computed by iteratively performing dependence estimation and maximization: dependence estimation is carried out analytically by the recently proposed least-squares mutual information (LSMI) estimator, and dependence maximization is also carried out analytically by utilizing the Epanechnikov kernel. Through large-scale experiments on real-world image classification and audio tagging problems, the proposed method is shown to compare favorably with existing dimension reduction approaches.
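To make the two alternating steps in the abstract concrete, below is a minimal NumPy sketch, not the authors' implementation. It assumes LSMI is computed by fitting the density ratio p(z, y) / (p(z) p(y)) with Gaussian kernel basis functions centred on the paired samples, which yields a closed-form ridge-regularised solution. The paper's analytic Epanechnikov-kernel maximization step is not reproduced; a crude random-search update of the projection matrix W stands in for it, purely to illustrate the estimate-then-maximize loop. All function names and parameters (gauss_kernel, sca_sketch, sigma_z, lam, and so on) are illustrative assumptions, not identifiers from the paper.

```python
# Minimal sketch of LSMI-based dependence estimation plus a stand-in
# maximization step; assumptions are noted in the lead-in paragraph above.
import numpy as np


def gauss_kernel(a, b, sigma):
    """Gaussian kernel matrix between rows of a and rows of b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def lsmi(z, y, sigma_z=1.0, sigma_y=1.0, lam=1e-3):
    """Least-squares mutual information estimate between z and y (2-D arrays)."""
    n = z.shape[0]
    Kz = gauss_kernel(z, z, sigma_z)  # basis functions centred on all samples
    Ky = gauss_kernel(y, y, sigma_y)
    # h_l  = (1/n)   sum_i k_z(z_i, z_l) k_y(y_i, y_l)        (joint expectation)
    h = (Kz * Ky).mean(axis=0)
    # H_ll' = (1/n^2) sum_i sum_j k_z(z_i,z_l) k_z(z_i,z_l') k_y(y_j,y_l) k_y(y_j,y_l')
    H = (Kz.T @ Kz) * (Ky.T @ Ky) / (n ** 2)
    alpha = np.linalg.solve(H + lam * np.eye(n), h)  # analytic (closed-form) solution
    return 0.5 * h @ alpha - 0.5                     # plug-in SMI estimate


def sca_sketch(x, y, dim, n_iter=20, seed=0):
    """Alternate LSMI dependence estimation and a stand-in maximization step."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    W, _ = np.linalg.qr(rng.standard_normal((d, dim)))  # orthonormal projection
    best = lsmi(x @ W, y)
    for _ in range(n_iter):
        # Random perturbation in place of the paper's analytic update of W.
        W_new, _ = np.linalg.qr(W + 0.1 * rng.standard_normal((d, dim)))
        score = lsmi(x @ W_new, y)
        if score > best:  # keep the projection only if dependence increased
            W, best = W_new, score
    return W, best


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.standard_normal((200, 5))
    y = x[:, :1] + 0.1 * rng.standard_normal((200, 1))  # y depends on x[:, 0] only
    W, smi = sca_sketch(x, y, dim=1)
    print("estimated SMI:", round(float(smi), 3))
    print("leading projection direction:", np.round(W[:, 0], 2))
```

On toy data like the example above, the recovered one-dimensional projection should roughly align with the first input coordinate, since that is the only feature the output depends on.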
Gang Niu
Masashi Sugiyama
Jun Takagi
Makoto Yamada