Mathematics – Statistics Theory
Joint variable and rank selection for parsimonious estimation of high dimensional matrices
Scientific paper
2011-10-17
57 pages
This article is devoted to optimal dimension reduction methods for sparse, high-dimensional multivariate response regression models, in which both the number of responses and the number of predictors may exceed the sample size. Predictor selection and rank reduction, sometimes viewed as complementary, are the most popular strategies for obtaining lower-dimensional approximations of the parameter matrix in such models. We show that important gains in prediction accuracy can be obtained by considering them jointly. To this end, we first motivate a new class of sparse multivariate regression models, in which the coefficient matrix has both low rank and zero rows, or can be well approximated by such a matrix. We then introduce estimators based on penalized least squares, with novel penalties that impose simultaneous row and rank restrictions on the coefficient matrix. We prove that these estimators adapt to the unknown matrix sparsity and have fast rates of convergence. We support our theoretical results with an extensive simulation study and two data analyses.
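The abstract describes the estimator only at a high level. For intuition, here is a minimal NumPy sketch of joint row and rank reduction; it substitutes a simple two-stage surrogate (row screening on a pilot ridge fit, followed by rank-penalized reduced-rank refitting) for the paper's actual joint penalty. The function name joint_row_rank_fit and all tuning parameters (ridge, row_thresh, rank_pen) are hypothetical and would in practice be chosen by cross-validation.

```python
import numpy as np

def joint_row_rank_fit(Y, X, ridge=1e-2, row_thresh=0.5, rank_pen=1.0):
    """Illustrative two-stage stand-in for joint row- and rank-penalized
    least squares (not the paper's algorithm). All tuning parameters are
    hypothetical placeholders."""
    n, p = X.shape
    # Step 1: pilot ridge fit; keep predictors whose coefficient rows
    # have large Euclidean norm (a surrogate for the row-sparsity penalty).
    B0 = np.linalg.solve(X.T @ X + ridge * np.eye(p), X.T @ Y)
    keep = np.linalg.norm(B0, axis=1) > row_thresh
    if not keep.any():
        return np.zeros((p, Y.shape[1])), keep, 0
    Xs = X[:, keep]
    # Step 2: least squares on the selected predictors ...
    Bs, *_ = np.linalg.lstsq(Xs, Y, rcond=None)
    # ... then reduced-rank shrinkage: project the fitted values onto
    # their top right singular vectors, choosing the rank r that
    # minimizes RSS + rank_pen * r.
    F = Xs @ Bs
    U, s, Vt = np.linalg.svd(F, full_matrices=False)
    best_r, best_crit = 0, float(np.sum(Y ** 2))
    for r in range(1, len(s) + 1):
        Fr = (U[:, :r] * s[:r]) @ Vt[:r]      # rank-r fitted values
        crit = float(np.sum((Y - Fr) ** 2)) + rank_pen * r
        if crit < best_crit:
            best_r, best_crit = r, crit
    # Assemble the full coefficient matrix: zero rows for dropped
    # predictors, reduced-rank coefficients for the kept ones.
    B = np.zeros((p, Y.shape[1]))
    B[keep] = Bs @ Vt[:best_r].T @ Vt[:best_r]
    return B, keep, best_r

# Usage on synthetic data with a row-sparse, low-rank coefficient matrix.
rng = np.random.default_rng(0)
n, p, m, r = 100, 30, 10, 2
B_true = np.zeros((p, m))
B_true[:5] = rng.normal(size=(5, r)) @ rng.normal(size=(r, m))
X = rng.normal(size=(n, p))
Y = X @ B_true + 0.1 * rng.normal(size=(n, m))
B_hat, keep, rank = joint_row_rank_fit(Y, X)
```

The reduced-rank step uses the classical fact that, for squared-error loss, the best rank-r coefficient matrix supported on the selected rows is the unrestricted least squares solution projected onto the top r right singular vectors of its fitted values.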
Florentina Bunea
Yiyuan She
Marten Wegkamp