Article

Accelerating Gossip-Based Deep Learning in Heterogeneous Edge Computing Platforms

Journal

IEEE Transactions on Parallel and Distributed Systems

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPDS.2020.3046440

Keywords

Training; Data models; Computational modeling; Peer-to-peer computing; Distributed databases; Acceleration; Servers; Deep learning; decentralized training; gossip; edge computing

Funding

  1. National Key Research and Development Plan of China [2018YFB1003701, 2018YFB1003700]
  2. National Natural Science Foundation of China [61872337]
  3. Swiss National Science Foundation (SNF) [NRP75, 407540_167266]

Summary

Research shows that, with the exponential growth of data generated at the network edge, decentralized, Gossip-based training of deep learning models is gaining momentum. The EdgeGossip framework is designed to reduce performance variation among heterogeneous edge platforms during training and to reach the best possible model accuracy quickly. An implementation of EdgeGossip based on popular Gossip algorithms reduces model training time by an average of 2.70 times with only 0.78% accuracy loss.

Abstract
With the exponential growth of data created at the network edge, decentralized and Gossip-based training of deep learning (DL) models on edge computing (EC) platforms is gaining tremendous research momentum, owing to its capability to learn from resource-strenuous edge nodes with limited network connectivity. Today's edge devices are extremely heterogeneous, e.g., in their hardware and software stacks, which results in high variation in training time and induces extra delay in synchronization and convergence. The large body of prior art accelerates DL, whether via data or model parallelization, through a centralized server, e.g., the parameter-server scheme, which can easily become a system bottleneck or single point of failure. In this article, we propose EdgeGossip, a framework specifically designed to accelerate decentralized, Gossip-based DL training on heterogeneous EC platforms. EdgeGossip features: (i) low performance variation among multiple EC platforms during iterative training, and (ii) accuracy-aware training to quickly obtain the best possible model accuracy. We implement EdgeGossip on top of popular Gossip algorithms and demonstrate its effectiveness on real-world DL workloads, reducing model training time by an average of 2.70 times while incurring accuracy losses of only 0.78 percent.
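
The abstract builds on pairwise Gossip averaging as the core training primitive. Below is a minimal, self-contained sketch of that primitive, combining local SGD steps with randomized pairwise parameter averaging; the `Node` class, the peer-selection rule, and the toy quadratic objective are illustrative assumptions for exposition, not the paper's EdgeGossip implementation.

```python
# Minimal sketch of one Gossip-SGD round. Illustrative only: Node,
# gossip_round, and the toy objective are hypothetical simplifications,
# not the authors' EdgeGossip code.
import numpy as np

rng = np.random.default_rng(0)

class Node:
    """One edge device holding a local model as a flat parameter vector."""
    def __init__(self, dim):
        self.params = rng.normal(size=dim)

    def local_sgd_step(self, grad_fn, lr=0.01):
        # One SGD step on locally available data.
        self.params -= lr * grad_fn(self.params)

    def gossip_average(self, peer):
        # Pairwise averaging: both nodes move to the midpoint of their
        # parameters -- the core primitive of Gossip-based training.
        avg = 0.5 * (self.params + peer.params)
        self.params = avg.copy()
        peer.params = avg.copy()

def gossip_round(nodes, grad_fn):
    # Each node first computes locally, then gossips with a random peer.
    for n in nodes:
        n.local_sgd_step(grad_fn)
    for n in nodes:
        peer = nodes[rng.integers(len(nodes))]
        if peer is not n:
            n.gossip_average(peer)

# Toy objective: minimize ||w - w*||^2 so convergence is observable.
target = rng.normal(size=10)
grad = lambda w: 2.0 * (w - target)

nodes = [Node(dim=10) for _ in range(8)]
for _ in range(200):
    gossip_round(nodes, grad)
print("max distance to optimum:",
      max(np.linalg.norm(n.params - target) for n in nodes))
```

Running this drives all nodes toward the shared optimum without any central server; EdgeGossip's contribution, per the abstract, is making such rounds fast and accuracy-aware when the participating nodes have heterogeneous speeds.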
