Article

CEFL: Online Admission Control, Data Scheduling, and Accuracy Tuning for Cost-Efficient Federated Learning Across Edge Nodes

Journal

IEEE INTERNET OF THINGS JOURNAL
Volume 7, Issue 10, Pages 9341-9356

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JIOT.2020.2984332

Keywords

Computational modeling; Data models; Artificial intelligence; Cloud computing; Training; Internet of Things; Load modeling; Distributed learning; edge computing; edge intelligence; federated learning; online scheduling

Funding

  1. National Key Research and Development Program of China [2017YFB1001703]
  2. National Science Foundation of China [61802449, 61702287, 61802018]
  3. Guangdong Natural Science Funds [2018A030313032]
  4. Program for Basic and Applied Basic Research Fund of Guangdong [2019A1515010030]
  5. Natural Science Foundation of Tianjin [18JCQNJC00200]
  6. Young Elite Scientists Sponsorship Program by Tianjin [TJSQNTJ-2018-19]
  7. Beijing Institute of Technology Research Fund Program

Abstract

With the proliferation of the Internet of Things (IoT), zillions of bytes of data are generated at the network edge, creating an urgent need to push the frontiers of artificial intelligence (AI) to the network edge so as to fully unleash the potential of IoT big data. To materialize this vision, known as edge intelligence, federated learning is emerging as a promising solution that enables edge nodes to collaboratively learn a shared model in a privacy-preserving and communication-efficient manner, by keeping the data at the edge nodes. While pilot efforts on federated learning have mostly focused on reducing the communication overhead, the computation efficiency of those resource-constrained edge nodes has been largely overlooked. To bridge this gap, in this article, we investigate how to coordinate the edge and the cloud to optimize the system-wide cost efficiency of federated learning. Leveraging Lyapunov optimization theory, we design and analyze a cost-efficient optimization framework, CEFL, that makes online yet near-optimal control decisions on admission control, load balancing, data scheduling, and accuracy tuning for dynamically arriving training data samples, reducing both computation and communication cost. In particular, our control framework CEFL can be flexibly extended to incorporate various design choices and practical requirements of federated learning, such as exploiting cheaper cloud resources for model training with better cost efficiency while still facilitating on-demand privacy preservation. Via both rigorous theoretical analysis and extensive trace-driven evaluations, we verify the cost efficiency of our proposed CEFL framework.
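The Lyapunov-based online control the abstract refers to is commonly realized via a drift-plus-penalty rule: keep a (virtual) queue of backlogged work and, in each slot, pick the action minimizing a weighted sum of instantaneous cost and queue backlog. The sketch below is a generic, hypothetical illustration of that pattern for scheduling training samples, not the authors' exact CEFL algorithm; the cost model, candidate rates, and parameter `V` are all assumptions for the example.

```python
# Hypothetical drift-plus-penalty sketch (generic Lyapunov online scheduling,
# not the paper's actual CEFL control rules).
import random

V = 10.0                              # cost-vs-backlog tradeoff knob (assumed)
Q = 0.0                               # virtual queue: backlog of unprocessed samples
unit_cost = lambda b: 0.1 * b * b     # assumed convex per-slot processing cost
rates = [0, 1, 2, 3, 4]               # candidate processing rates per slot

random.seed(0)
for t in range(100):
    arrivals = random.randint(0, 4)   # dynamically arriving training samples
    # Drift-plus-penalty: choose rate b minimizing V*cost(b) - Q*b,
    # so a larger backlog Q pushes the controller toward faster (costlier) rates.
    b = min(rates, key=lambda r: V * unit_cost(r) - Q * r)
    Q = max(Q - b, 0.0) + arrivals    # queue update: serve, then admit arrivals
```

Because the chosen rate grows with the backlog, the queue stays bounded while per-slot cost stays near its minimum, which is the intuition behind the "online yet near-optimal" guarantee the abstract mentions.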
