Article

HYBRID DETERMINISTIC-STOCHASTIC METHODS FOR DATA FITTING

Journal

SIAM JOURNAL ON SCIENTIFIC COMPUTING
Volume 34, Issue 3, Pages A1380-A1405

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/110830629

Keywords

optimization; data fitting; incremental gradient; gradient descent

Funding

  1. NSERC [312104]
  2. CRD DNOISE II NSERC grant
  3. SIERRA from the European Research Council [SIERRA-ERC-239993]

Abstract

Many structured data-fitting applications require the solution of an optimization problem involving a sum over a potentially large number of measurements. Incremental gradient algorithms offer inexpensive iterations by sampling a subset of the terms in the sum; these methods can make great progress initially, but often slow as they approach a solution. In contrast, full-gradient methods achieve steady convergence at the expense of evaluating the full objective and gradient on each iteration. We explore hybrid methods that exhibit the benefits of both approaches. Rate-of-convergence analysis shows that by controlling the sample size in an incremental-gradient algorithm, it is possible to maintain the steady convergence rates of full-gradient methods. We detail a practical quasi-Newton implementation based on this approach. Numerical experiments illustrate its potential benefits.
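To make the abstract's idea concrete, here is a minimal sketch (not the paper's actual algorithm or its quasi-Newton implementation) of a sampled-gradient method whose batch size grows geometrically: early iterations are cheap and noisy like an incremental-gradient method, while later iterations approach full-gradient accuracy. The problem, step-size rule, and growth factor are all illustrative assumptions.

```python
import random

# Hedged sketch, not the paper's exact method: gradient descent on the
# least-squares sum f(x) = (1/2) * sum_i (a_i*x - b_i)^2, where each
# iteration estimates the gradient from a random sample of the terms and
# the sample size grows geometrically -- cheap, noisy steps early,
# full-gradient accuracy later.
def growing_sample_gradient_descent(a, b, x0=0.0, iters=200, growth=1.1, seed=0):
    n = len(a)
    step = 1.0 / max(ai * ai for ai in a)  # step safe for every single term
    rng = random.Random(seed)
    x, sample_size = x0, 1.0
    for _ in range(iters):
        m = min(n, max(1, int(sample_size)))
        batch = rng.sample(range(n), m)
        # sample-average estimate of the mean gradient (1/n) * f'(x)
        g = sum(a[i] * (a[i] * x - b[i]) for i in batch) / m
        x -= step * g
        sample_size *= growth  # grow the batch geometrically
    return x

# toy consistent data: b_i = 2*a_i, so the least-squares solution is x = 2
a = [float(i + 1) for i in range(50)]
b = [2.0 * ai for ai in a]
x_hat = growing_sample_gradient_descent(a, b)
```

Because the sample size eventually reaches the full sum, the late iterations behave like deterministic gradient descent, which is the mechanism the abstract credits for preserving steady convergence rates.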
