An algorithm for the principal component analysis of large data sets
Statistics – Computation
Scientific paper
2010-07-30
SIAM Journal on Scientific Computing, 33 (5): 2580-2594, 2011
17 pages, 3 figures (each with 2 or 3 subfigures), 2 tables (each with 2 subtables)
Recently popularized randomized methods for principal component analysis (PCA) efficiently and reliably produce approximations of nearly optimal accuracy, even on parallel processors, unlike the classical (deterministic) alternatives. We adapt one of these randomized methods for use with data sets that are too large to be stored in random-access memory (RAM); in the traditional terminology, our procedure works efficiently "out-of-core." We illustrate the performance of the algorithm via several numerical examples. For instance, we report on the PCA of a data set stored on disk that is so large that less than a hundredth of it can fit in our computer's RAM.
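The abstract describes a randomized method (a random projection onto a small subspace, followed by an SVD of the resulting small matrix) adapted so that the data matrix never needs to reside in RAM. The following Python sketch is a rough illustration of that general idea, not the authors' exact algorithm: it approximates the top-k principal components of a tall matrix stored on disk, reading only a block of rows at a time via numpy.memmap. The file path, block size, and the assumption that the data are already mean-centered are illustrative choices, not part of the paper.

    import numpy as np

    def blocked_randomized_pca(path, shape, k, oversample=10, block_rows=1000, seed=0):
        """Sketch only: approximate top-k principal components of a tall matrix
        stored on disk, touching only `block_rows` rows of it at a time.
        Assumes the data are already mean-centered; not the exact algorithm
        of Halko, Martinsson, Shkolnisky, and Tygert."""
        m, n = shape
        A = np.memmap(path, dtype=np.float64, mode="r", shape=shape)  # on-disk matrix
        rng = np.random.default_rng(seed)
        l = k + oversample                      # slightly enlarged sketch dimension

        # Sample the range of A: Y = A @ G, accumulated block by block,
        # so only one block of A is ever in RAM.
        G = rng.standard_normal((n, l))
        Y = np.empty((m, l))
        for start in range(0, m, block_rows):
            stop = min(start + block_rows, m)
            Y[start:stop] = A[start:stop] @ G

        # Orthonormal basis for the sampled range (Y is only m x l, which fits in RAM).
        Q, _ = np.linalg.qr(Y)

        # Project A onto the basis: B = Q.T @ A, again accumulated block by block.
        B = np.zeros((l, n))
        for start in range(0, m, block_rows):
            stop = min(start + block_rows, m)
            B += Q[start:stop].T @ A[start:stop]

        # SVD of the small l x n matrix yields the approximate principal components.
        U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
        U = Q @ U_small
        return U[:, :k], s[:k], Vt[:k, :]

In this sketch the only full-size arrays held in memory are Y and Q, each with l = k + oversample columns, so the memory footprint grows with the number of requested components rather than with the width of the data; the paper's actual scheme addresses the same constraint with additional refinements (e.g., handling the case where even m x l does not fit in RAM).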
Nathan Halko
Per-Gunnar Martinsson
Yoel Shkolnisky
Mark Tygert