Journal
SIAM JOURNAL ON OPTIMIZATION
Volume 14, Issue 3, Pages 807-840
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/S1052623400376366
Keywords
nondifferentiable optimization; convex programming; subgradient optimization; approximate subgradients; efficiency
Abstract
We present a unified convergence framework for approximate subgradient methods that covers various stepsize rules (including both diminishing and nonvanishing stepsizes), convergence in objective values, and convergence to a neighborhood of the optimal set. We discuss ways of ensuring the boundedness of the iterates and give efficiency estimates. Our results are extended to incremental subgradient methods for minimizing a sum of convex functions, which have recently been shown to be promising for various large-scale problems, including those arising from Lagrangian relaxation.
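To make the setting concrete, the following is a minimal Python sketch of an incremental subgradient iteration for minimizing a sum of convex functions f(x) = sum_i f_i(x). It is an illustrative assumption, not the paper's specific scheme: the function names, the diminishing stepsize rule, and the use of exact (rather than approximate, i.e. epsilon-) subgradients are all choices made here for brevity.

    import numpy as np

    def incremental_subgradient(component_subgrads, x0, stepsize, num_epochs):
        """Sketch of an incremental subgradient method for f = sum_i f_i.

        component_subgrads: list of callables, each returning a subgradient
        of one component f_i at the current point. The paper's framework
        also covers approximate subgradients; exact ones are used here
        for simplicity.
        """
        x = np.asarray(x0, dtype=float)
        for k in range(num_epochs):
            t = stepsize(k)  # e.g. a diminishing rule such as t_k = c/(k+1)
            for g in component_subgrads:
                # One incremental step per component, cycling through them.
                x = x - t * g(x)
        return x

    # Illustrative example: f(x) = |x - 1| + |x + 1|, whose optimal set
    # is the interval [-1, 1] with optimal value 2.
    g1 = lambda x: np.sign(x - 1.0)  # a subgradient of |x - 1|
    g2 = lambda x: np.sign(x + 1.0)  # a subgradient of |x + 1|
    x_final = incremental_subgradient([g1, g2], x0=[3.0],
                                      stepsize=lambda k: 1.0 / (k + 1),
                                      num_epochs=200)
    print(x_final)  # lands in (a neighborhood of) the optimal set [-1, 1]

The diminishing stepsize 1/(k+1) matches one of the rule families the abstract mentions; with a nonvanishing stepsize, the iterates can in general only be expected to reach a neighborhood of the optimal set, which is exactly the regime the paper's framework covers.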