Hybrid Deterministic-Stochastic Methods for Data Fitting

Computer Science – Numerical Analysis

Scientific paper

Details

26 pages. Expanded Section 2.3 (Relaxing strong convexity)

Abstract

Many structured data-fitting applications require the solution of an optimization problem involving a sum over a potentially large number of measurements. Incremental gradient algorithms offer inexpensive iterations by sampling a subset of the terms in the sum. These methods can make great progress initially, but often slow as they approach a solution. In contrast, full-gradient methods achieve steady convergence at the expense of evaluating the full objective and gradient on each iteration. We explore hybrid methods that exhibit the benefits of both approaches. Rate-of-convergence analysis shows that by controlling the sample size in an incremental gradient algorithm, it is possible to maintain the steady convergence rates of full-gradient methods. We detail a practical quasi-Newton implementation based on this approach. Numerical experiments illustrate its potential benefits.
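The mechanism the abstract describes, growing the sample size of an incremental-gradient method so that the error in the sampled gradient shrinks along with the optimization error, is easy to illustrate. The Python sketch below applies the idea to a least-squares fit min_x (1/2m)||Ax - b||^2. It is not the paper's implementation: the function name, the geometric growth rule, and the fixed step size are illustrative assumptions, and the paper's practical variant additionally replaces the plain gradient step with a quasi-Newton direction.

```python
import numpy as np

def growing_batch_gradient(A, b, step=None, growth=1.1, batch0=10,
                           max_iter=500, rng=None):
    """Gradient descent for f(x) = (1/2m) * ||Ax - b||^2 where the gradient
    is estimated on a random sample whose size grows geometrically.

    A minimal sketch of the hybrid idea: early iterations use cheap small
    batches (incremental-gradient regime); as the batch grows toward m, the
    steps approach full-gradient steps and their steady convergence rate.
    """
    rng = np.random.default_rng() if rng is None else rng
    m, n = A.shape
    x = np.zeros(n)
    if step is None:
        # Fixed step 1/L, where L = ||A||_2^2 / m is the Lipschitz
        # constant of the full gradient A^T (A x - b) / m.
        step = m / np.linalg.norm(A, 2) ** 2
    batch = float(batch0)
    for _ in range(max_iter):
        k = min(m, int(batch))
        idx = rng.choice(m, size=k, replace=False)  # sample k of m terms
        # Unbiased gradient estimate: average of the k sampled residual terms.
        g = A[idx].T @ (A[idx] @ x - b[idx]) / k
        x -= step * g
        batch *= growth  # geometric growth shrinks the sampling error
    return x

if __name__ == "__main__":
    # Small synthetic data-fitting problem.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((2000, 50))
    b = A @ rng.standard_normal(50) + 0.01 * rng.standard_normal(2000)
    x_hat = growing_batch_gradient(A, b, rng=rng)
    print(np.linalg.norm(A @ x_hat - b))
```

The geometric growth rule is the essential choice here: if the sample grows too slowly the gradient error dominates and progress stalls near the solution, which is the slowdown the abstract attributes to plain incremental-gradient methods.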
