Article

CONVERGENCE RATE ANALYSIS OF A SEQUENTIAL CONVEX PROGRAMMING METHOD WITH LINE SEARCH FOR A CLASS OF CONSTRAINED DIFFERENCE-OF-CONVEX OPTIMIZATION PROBLEMS

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 31, Issue 3, Pages 2024-2054

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/20M1314057

Keywords

constrained optimization; difference-of-convex optimization; Kurdyka-Lojasiewicz property; linear convergence

Funding

  1. Hong Kong Research Grants Council [PolyU153005/17p]


This paper studies the sequential convex programming method with monotone line search (SCPls) for a class of difference-of-convex optimization problems with smooth inequality constraints. The convergence rate of the sequence generated by SCPls in nonconvex and convex settings under suitable Kurdyka-Lojasiewicz (KL) assumptions is analyzed. The paper also discusses how to deduce the KL exponent of the extended objective function from its Lagrangian in convex settings.
In this paper, we study the sequential convex programming method with monotone line search (SCPls) in [Z. Lu, Sequential Convex Programming Methods for a Class of Structured Nonlinear Programming, https://arxiv.org/abs/1210.3039, 2012] for a class of difference-of-convex optimization problems with multiple smooth inequality constraints. The SCPls is a representative variant of moving-ball-approximation-type algorithms [A. Auslender, R. Shefi, and M. Teboulle, SIAM J. Optim., 20 (2010), pp. 3232-3259; J. Bolte, Z. Chen, and E. Pauwels, Math. Program., 182 (2020), pp. 1-36; J. Bolte and E. Pauwels, Math. Oper. Res., 41 (2016), pp. 442-465; R. Shefi and M. Teboulle, Math. Program., 159 (2016), pp. 137-164] for constrained optimization problems. We analyze the convergence rate of the sequence generated by SCPls in both nonconvex and convex settings by imposing suitable Kurdyka-Lojasiewicz (KL) assumptions. Specifically, in the nonconvex settings, we assume that a special potential function related to the objective and the constraints is a KL function, while in the convex settings we impose KL assumptions directly on the extended objective function (i.e., the sum of the objective and the indicator function of the constraint set). A relationship between these two different KL assumptions is established in the convex settings under additional differentiability assumptions. We also discuss how to deduce the KL exponent of the extended objective function from its Lagrangian in the convex settings, under additional assumptions on the constraint functions. Thanks to this result, the extended objectives of some constrained optimization models, such as minimizing ℓ1 subject to logistic/Poisson loss, are found to be KL functions with exponent 1/2 under mild assumptions. To illustrate how our results can be applied, we consider SCPls for minimizing ℓ1-2 [P. Yin, Y. Lou, Q. He, and J. Xin, SIAM J. Sci. Comput., 37 (2015), pp. A536-A563] subject to residual error measured by the ℓ2 norm/Lorentzian norm [R. E. Carrillo, A. B. Ramirez, G. R. Arce, K. E. Barner, and B. M. Sadler, EURASIP J. Adv. Signal Process. (2016), 108]. We first discuss how the various conditions required in our analysis can be verified, and then perform numerical experiments to illustrate the convergence behaviors of SCPls.
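For readers unfamiliar with the model mentioned at the end of the abstract, the following minimal Python sketch evaluates the two quantities involved: the ℓ1-2 penalty of Yin et al. (‖x‖₁ − ‖x‖₂) and the Lorentzian norm of a residual vector as in Carrillo et al. This is only an illustration of the objective and residual measures, not of the SCPls algorithm itself; the scale parameter `gamma` is an assumed example value.

```python
import numpy as np

def l1_minus_l2(x):
    # ℓ1-2 penalty: ||x||_1 - ||x||_2 (Yin, Lou, He, Xin, 2015)
    return np.sum(np.abs(x)) - np.linalg.norm(x)

def lorentzian_norm(r, gamma=1.0):
    # Lorentzian (LL2) norm: sum_i log(1 + r_i^2 / gamma^2)
    # (Carrillo et al., 2016); robust to outliers in the residual r
    return np.sum(np.log1p((r / gamma) ** 2))

x = np.array([3.0, 0.0, -4.0])
print(l1_minus_l2(x))      # ||x||_1 = 7, ||x||_2 = 5, so 2.0
print(lorentzian_norm(x))  # log(10) + log(1) + log(17)
```

In the constrained model analyzed in the paper, one would minimize `l1_minus_l2(x)` subject to a bound on the residual measure (ℓ2 or Lorentzian) of `A @ x - b`.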
