Estimation of (near) low-rank matrices with noise and high-dimensional scaling
Scientific paper
Mathematics – Statistics Theory
2009-12-27
Appeared as a Statistics technical report, UC Berkeley
High-dimensional inference refers to problems of statistical estimation in which the ambient dimension of the data may be comparable to or possibly even larger than the sample size. We study an instance of high-dimensional inference in which the goal is to estimate a matrix $\Theta^* \in \mathbb{R}^{k \times p}$ on the basis of $N$ noisy observations, and the unknown matrix $\Theta^*$ is assumed to be either exactly low rank, or "near" low-rank, meaning that it can be well-approximated by a matrix with low rank. We consider an $M$-estimator based on regularization by the trace or nuclear norm over matrices, and analyze its performance under high-dimensional scaling. We provide non-asymptotic bounds on the Frobenius norm error that hold for a general class of noisy observation models, and then illustrate their consequences for a number of specific matrix models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections. Simulation results show excellent agreement with the high-dimensional scaling of the error predicted by our theory.
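To make the nuclear-norm-regularized $M$-estimator concrete, the sketch below solves a multivariate regression instance, $Y = X\Theta^* + W$, by proximal gradient descent with singular value thresholding. This is a minimal illustration, not the authors' code: the solver, step size, regularization level, and problem dimensions are assumptions chosen for the example, and only the general estimator (least squares plus a nuclear norm penalty) follows the abstract.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def nuclear_norm_regression(X, Y, lam, n_iter=500):
    """Proximal gradient for min_Theta (1/2n)||Y - X Theta||_F^2 + lam ||Theta||_*.

    X: (n, k) design, Y: (n, p) responses; returns Theta_hat of shape (k, p).
    """
    n, k = X.shape
    p = Y.shape[1]
    # Step size 1/L, where L = ||X^T X / n||_2 is the gradient Lipschitz constant.
    step = n / (np.linalg.norm(X, 2) ** 2)
    Theta = np.zeros((k, p))
    for _ in range(n_iter):
        grad = X.T @ (X @ Theta - Y) / n
        Theta = svt(Theta - step * grad, step * lam)
    return Theta

# Toy usage: an exactly rank-3 Theta* observed through noisy multivariate regression.
rng = np.random.default_rng(0)
n, k, p, r = 200, 30, 40, 3
Theta_star = rng.standard_normal((k, r)) @ rng.standard_normal((r, p))
X = rng.standard_normal((n, k))
Y = X @ Theta_star + 0.5 * rng.standard_normal((n, p))
Theta_hat = nuclear_norm_regression(X, Y, lam=0.5)
print("Frobenius norm error:", np.linalg.norm(Theta_hat - Theta_star, "fro"))
```

The Frobenius norm error printed at the end is the quantity bounded non-asymptotically in the paper; here the sample size n plays the role of the abstract's $N$.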
Sahand Negahban
Martin J. Wainwright