Computer Science – Learning
Scientific paper
2010-06-25
The 26th Conference on Uncertainty in Artificial Intelligence (UAI 2010), Catalina Island, California, July 8-11, 2010
Designing and implementing efficient, provably correct parallel machine learning (ML) algorithms is challenging. Existing high-level parallel abstractions like MapReduce are insufficiently expressive while low-level tools like MPI and Pthreads leave ML experts repeatedly solving the same design challenges. By targeting common patterns in ML, we developed GraphLab, which improves upon abstractions like MapReduce by compactly expressing asynchronous iterative algorithms with sparse computational dependencies while ensuring data consistency and achieving a high degree of parallel performance. We demonstrate the expressiveness of the GraphLab framework by designing and implementing parallel versions of belief propagation, Gibbs sampling, Co-EM, Lasso and Compressed Sensing. We show that using GraphLab we can achieve excellent parallel performance on large scale real-world problems.
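To make the abstraction concrete, the following is a minimal illustrative sketch (not the actual GraphLab API; all names here are hypothetical) of the pattern the abstract describes: an asynchronous scheduler that applies a user-supplied update function to vertices of a sparse dependency graph, rescheduling only the neighbors of a vertex whose value changed. It is shown running a toy PageRank-style computation on a small undirected graph.

```python
# Hypothetical sketch of an asynchronous vertex-update scheduler over a
# sparse dependency graph. This is NOT the real GraphLab interface; it
# only illustrates the "asynchronous iterative algorithms with sparse
# computational dependencies" pattern from the abstract.
from collections import deque

def run_async(neighbors, update, init, tol=1e-6):
    """Apply `update` to scheduled vertices until no vertex changes by
    more than `tol`; a changed vertex reschedules its neighbors only."""
    values = {v: init for v in neighbors}
    queue = deque(neighbors)            # initially schedule every vertex
    scheduled = set(neighbors)
    while queue:
        v = queue.popleft()
        scheduled.discard(v)
        new = update(v, values, neighbors)
        changed = abs(new - values[v]) > tol
        values[v] = new
        if changed:
            for u in neighbors[v]:      # sparse dependencies: only the
                if u not in scheduled:  # neighbors of v are rescheduled
                    queue.append(u)
                    scheduled.add(u)
    return values

# Toy PageRank-style update on an undirected triangle graph.
def pagerank(v, values, neighbors):
    return 0.15 + 0.85 * sum(values[u] / len(neighbors[u])
                             for u in neighbors[v])

graph = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
ranks = run_async(graph, pagerank, init=0.5)
```

On this symmetric graph every rank converges toward 1.0; the point of the sketch is the scheduling discipline, in which convergence at one vertex stops propagating work to its neighbors, rather than the numerical result.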
Danny Bickson
Joseph Gonzalez
Carlos Guestrin
Joseph M. Hellerstein
Aapo Kyrola