Journal
Publisher
IEEE
DOI: 10.1109/INFOCOMWKSHPS54753.2022.9798021
Keywords
Semi-federated learning; convergence analysis; model aggregation; transceiver design
Category
Funding
- National Key R&D Program of China [2020YFB1807801]
This paper proposes SemiFL, a semi-federated learning framework for cellular networks that addresses the waste of computing resources at the base station by having local devices simultaneously send gradient updates and training samples to it. The proposed framework improves accuracy and convergence speed compared to conventional FL.
In cellular-based federated learning (FL), the base station (BS) is only used to aggregate parameters, which incurs a waste of computing resources at the BS. In this paper, a novel semi-federated learning (SemiFL) framework is proposed to break this bottleneck, where local devices simultaneously send their gradient updates and training samples to the BS for global model computation. To capture the performance of SemiFL over wireless networks, a closed-form convergence upper bound of SemiFL is derived. Then, a non-convex problem is formulated to improve the convergence behavior of SemiFL, subject to the transmit power, communication latency, and computation distortion. To solve this intractable problem, a two-stage algorithm is proposed by controlling the transmit power and receive beamformers. Numerical experiments validate that the proposed SemiFL framework can effectively improve accuracy and accelerate convergence as compared to conventional FL.
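The core idea above, devices uploading both local gradients and raw training samples so the BS can contribute its own computation to the global model, can be illustrated with a minimal sketch. The loss function, the equal weighting `alpha`, and the update rule below are illustrative assumptions, not the paper's actual aggregation scheme (which involves over-the-air computation, power control, and receive beamforming).

```python
import numpy as np

def local_gradient(w, X, y):
    """Gradient of the mean-squared-error loss 0.5*||Xw - y||^2 / n."""
    n = X.shape[0]
    return X.T @ (X @ w - y) / n

def semifl_round(w, device_data, uploaded_data, lr=0.1, alpha=0.5):
    """One hypothetical SemiFL-style global round.

    Federated part: average the gradients each device computed locally.
    Centralized part: the BS computes a gradient on samples the devices
    uploaded. The two estimates are combined with an assumed weight alpha.
    """
    fed_grad = np.mean(
        [local_gradient(w, X, y) for X, y in device_data], axis=0
    )
    Xc, yc = uploaded_data
    bs_grad = local_gradient(w, Xc, yc)
    return w - lr * (alpha * fed_grad + (1 - alpha) * bs_grad)

# Synthetic noiseless linear-regression data for three devices plus a
# pool of samples uploaded to the BS.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
devices = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    devices.append((X, X @ w_true))
Xc = rng.normal(size=(10, 2))
uploaded = (Xc, Xc @ w_true)

w = np.zeros(2)
for _ in range(500):
    w = semifl_round(w, devices, uploaded)
```

After enough rounds `w` approaches `w_true`, showing that the hybrid gradient (device average plus BS-computed term) is still a valid descent direction; in the paper the interesting regime is when wireless distortion perturbs the federated term.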
Authors