HOGWILD!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent

Mathematics – Optimization and Control

Scientific paper


Details

22 pages, 10 figures

Abstract

Stochastic Gradient Descent (SGD) is a popular algorithm that can achieve state-of-the-art performance on a variety of machine learning tasks. Several researchers have recently proposed schemes to parallelize SGD, but all require performance-destroying memory locking and synchronization. This work aims to show, using novel theoretical analysis, algorithms, and implementation, that SGD can be implemented without any locking. We present an update scheme called HOGWILD! which allows processors to access shared memory with the possibility of overwriting each other's work. We show that when the associated optimization problem is sparse, meaning most gradient updates only modify small parts of the decision variable, then HOGWILD! achieves a nearly optimal rate of convergence. We demonstrate experimentally that HOGWILD! outperforms alternative schemes that use locking by an order of magnitude.
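The update scheme described in the abstract can be illustrated in code: each processor repeatedly samples an example, reads the shared iterate, and writes back a sparse gradient step without taking any locks. The following is a minimal sketch on a synthetic sparse least-squares problem; the data, model, and hyperparameters are illustrative assumptions, not taken from the paper, and because of Python's GIL the sketch demonstrates the lock-free update scheme rather than the reported speedups.

# Minimal HOGWILD!-style lock-free SGD sketch (illustrative only).
# Assumptions: synthetic sparse least-squares data, step size and thread
# count chosen arbitrarily; not the authors' implementation.
import threading
import numpy as np

n_features, n_samples, n_threads = 500, 5000, 4
step_size, nnz = 0.05, 10
rng = np.random.default_rng(0)

# Synthetic sparse examples: each example touches only `nnz` coordinates.
true_w = rng.standard_normal(n_features)
examples = []
for _ in range(n_samples):
    idx = rng.choice(n_features, size=nnz, replace=False)
    x = rng.standard_normal(nnz)
    y = x @ true_w[idx] + 0.01 * rng.standard_normal()
    examples.append((idx, x, y))

# Shared decision variable, read and written by all threads without locks.
w = np.zeros(n_features)

def worker(seed):
    local_rng = np.random.default_rng(seed)
    for _ in range(n_samples // n_threads):
        idx, x, y = examples[local_rng.integers(len(examples))]
        # Read the (possibly stale) shared iterate, compute the gradient of
        # one example, and update only the touched coordinates -- no locking.
        residual = x @ w[idx] - y
        w[idx] -= step_size * residual * x

threads = [threading.Thread(target=worker, args=(s,)) for s in range(n_threads)]
for t in threads:
    t.start()
for t in threads:
    t.join()

loss = np.mean([(x @ w[idx] - y) ** 2 for idx, x, y in examples])
print(f"mean squared error after lock-free SGD: {loss:.4f}")

Because each example is sparse, two threads rarely write to the same coordinate at the same time, which is the informal reason the paper's analysis can tolerate updates overwriting one another.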


