Article

Decentralized ADMM with compressed and event-triggered communication

Journal

NEURAL NETWORKS
Volume 165, Issue -, Pages 472-482

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2023.06.001

Keywords

Decentralized optimization; ADMM; Efficient communication; Second-order algorithms


Abstract

This paper considers the decentralized optimization problem, in which agents in a network cooperate to minimize the sum of their local objective functions through communication and local computation. We propose a decentralized second-order communication-efficient algorithm, the communication-censored and communication-compressed quadratically approximated alternating direction method of multipliers (ADMM), termed CC-DQM, which combines event-triggered communication with compressed communication. In CC-DQM, agents transmit a compressed message only when their current primal variables have changed substantially relative to the last transmitted estimates. Moreover, to reduce computation cost, the Hessian update is also scheduled by the trigger condition. Theoretical analysis shows that the proposed algorithm maintains exact linear convergence, despite compression error and intermittent communication, provided the local objective functions are strongly convex and smooth. Finally, numerical experiments demonstrate its satisfactory communication efficiency. (C) 2023 Elsevier Ltd. All rights reserved.
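The event-triggered, compressed transmission rule described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the top-k compressor, the Euclidean-norm trigger, and the function names are assumptions chosen for concreteness; CC-DQM's actual compressor, trigger condition, and ADMM updates are specified in the paper itself.

```python
import numpy as np

def compress_topk(vec, k):
    """Hypothetical compressor: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(vec)
    idx = np.argsort(np.abs(vec))[-k:]
    out[idx] = vec[idx]
    return out

def maybe_transmit(x, x_last_sent, threshold, k):
    """Event-triggered rule (illustrative): send a compressed update only when the
    primal variable has moved far enough from the last transmitted estimate.
    Returns (message or None, updated reference estimate)."""
    if np.linalg.norm(x - x_last_sent) >= threshold:
        msg = compress_topk(x - x_last_sent, k)  # compress the innovation, not x itself
        return msg, x_last_sent + msg            # neighbors track this reference copy
    return None, x_last_sent                     # trigger not fired: stay silent
```

Under such a rule, communication rounds are skipped whenever the iterate has barely moved, and fired rounds cost only the compressed innovation; the convergence analysis in the paper shows how much trigger slack and compression error can be tolerated while preserving exact linear convergence.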

