Article

An efficient gradient method with approximately optimal stepsizes based on regularization models for unconstrained optimization

Journal

RAIRO-OPERATIONS RESEARCH
Volume 56, Issue 4, Pages 2403-2424

Publisher

EDP SCIENCES S A
DOI: 10.1051/ro/2022107

Keywords

Approximately optimal stepsize; gradient method; regularization method; Barzilai-Borwein (BB) method; global convergence

Funding

  1. National Natural Science Foundation of China [11901561]
  2. Guizhou Provincial Science and Technology Projects [QKHJCZK[2022]YB084]

Abstract

This paper proposes an efficient gradient method with approximately optimal stepsizes for unconstrained optimization based on regularization models. Numerical experiments show the promising performance and efficiency of the proposed method.
It is widely accepted that the stepsize is of great significance to gradient methods. An efficient gradient method with approximately optimal stepsizes, based mainly on regularization models, is proposed for unconstrained optimization. More specifically, if the objective function is not close to a quadratic function on the line segment between the current and latest iterates, a regularization model is carefully exploited to generate the approximately optimal stepsize; otherwise, a quadratic approximation model is used. In addition, when the curvature is non-positive, a special regularization model is developed. The convergence of the proposed method is established under some weak conditions. Extensive numerical experiments indicate that the proposed method is very promising. Owing to its surprising efficiency, we believe that gradient methods with approximately optimal stepsizes can become strong candidates for large-scale unconstrained optimization.
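
The abstract outlines the stepsize strategy but not its exact formulas, so the following Python sketch only illustrates the general idea under stated assumptions: the curvature along the previous step is estimated by a scalar BB-type quotient, a simple interpolation test decides whether the objective looks "close to quadratic", a cubic-regularized scalar model supplies the stepsize otherwise, and a separate fallback handles non-positive curvature. An Armijo backtracking safeguard stands in for whatever globalization the paper actually uses; all names, thresholds, and constants (sigma, tol, the Armijo parameters) are hypothetical and not taken from the paper.

# Hedged sketch, NOT the paper's exact algorithm: steepest descent whose trial
# stepsize minimizes a simple scalar model of f(x_k - alpha * g_k), switching
# between a quadratic model (when f looks locally quadratic) and a
# cubic-regularized model, with a fallback when the curvature estimate is
# non-positive.

import numpy as np


def approx_optimal_stepsize(g, s, y, f_prev, f_curr, sigma=1.0, tol=0.1):
    """Trial stepsize along -g from simple one-dimensional models.

    g              -- current gradient
    s, y           -- previous step x_k - x_{k-1} and gradient change g_k - g_{k-1}
    f_prev, f_curr -- previous and current objective values
    sigma, tol     -- hypothetical regularization weight and "close to quadratic" tolerance
    """
    gg = float(g @ g)
    sy = float(s @ y)
    ss = float(s @ s)
    if gg <= 1e-16:
        return 1.0
    if sy <= 0.0 or ss == 0.0:
        # Non-positive curvature: minimize the purely regularized model
        # phi(a) = f - a*gg + (sigma/2) * a**2 * gg, whose minimizer is 1/sigma.
        return 1.0 / sigma

    # Scalar BB-type curvature estimate: g^T B g ~= (s^T y / s^T s) * ||g||^2.
    curvature = (sy / ss) * gg

    # Rough "close to quadratic" test: compare the achieved decrease f_prev - f_curr
    # with the decrease predicted by quadratic interpolation, -g^T s + 0.5 * s^T y.
    predicted = -float(g @ s) + 0.5 * sy
    achieved = f_prev - f_curr
    if abs(achieved - predicted) <= tol * max(1.0, abs(achieved)):
        # Quadratic model phi(a) = f - a*gg + 0.5*a**2*curvature -> a = gg/curvature.
        return gg / curvature

    # Cubic-regularized model
    # phi(a) = f - a*gg + 0.5*a**2*curvature + (sigma/6)*a**3*||g||^3;
    # phi'(a) = 0 is a quadratic in a with a unique positive root.
    c = 0.5 * sigma * gg ** 1.5
    disc = curvature ** 2 + 4.0 * c * gg
    return (-curvature + np.sqrt(disc)) / (2.0 * c)


def gradient_method(f, grad, x0, max_iter=500, gtol=1e-6):
    """Steepest descent with the stepsizes above and an Armijo backtracking safeguard."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    fx = f(x)
    alpha = 1.0 / max(1.0, float(np.linalg.norm(g)))  # simple initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) <= gtol:
            break
        # Backtrack until a sufficient-decrease condition holds (stand-in globalization).
        while f(x - alpha * g) > fx - 1e-4 * alpha * float(g @ g) and alpha > 1e-12:
            alpha *= 0.5
        x_old, g_old, f_old = x, g, fx
        x = x - alpha * g
        g = grad(x)
        fx = f(x)
        alpha = approx_optimal_stepsize(g, x - x_old, g - g_old, f_old, fx)
    return x, fx


if __name__ == "__main__":
    # Toy usage on an ill-conditioned convex quadratic (illustrative only).
    A = np.diag([1.0, 10.0, 100.0])
    x_star, f_star = gradient_method(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, np.ones(3))
    print(x_star, f_star)

Note that when the quadratic model is accepted, the stepsize gg/curvature reduces to the classical BB1 stepsize s^T s / s^T y, which is the connection to the Barzilai-Borwein method listed in the keywords.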
