Journal
IEEE INFOCOM 2022 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (INFOCOM WKSHPS)
Volume -, Issue -, Pages -
Publisher
IEEE
DOI: 10.1109/INFOCOMWKSHPS54753.2022.9798021
Keywords
Semi-federated learning; convergence analysis; model aggregation; transceiver design
Funding
- National Key R&D Program of China [2020YFB1807801]
This paper proposes a semi-federated learning (SemiFL) framework for cellular-based federated learning, which addresses the waste of computing resources at the base station by having local devices simultaneously send both gradient updates and training samples to the base station. The proposed framework improves accuracy and convergence speed compared to conventional FL.
In cellular-based federated learning (FL), the base station (BS) is only used to aggregate parameters, which incurs a waste of computing resources at the BS. In this paper, a novel semi-federated learning (SemiFL) framework is proposed to break this bottleneck, where local devices simultaneously send their gradient updates and training samples to the BS for global model computation. To capture the performance of SemiFL over wireless networks, a closed-form convergence upper bound of SemiFL is derived. Then, a non-convex problem is formulated to improve the convergence behavior of SemiFL, subject to the transmit power, communication latency, and computation distortion. To solve this intractable problem, a two-stage algorithm is proposed by controlling the transmit power and receive beamformers. Numerical experiments validate that the proposed SemiFL framework can effectively improve accuracy and accelerate convergence as compared to conventional FL.
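The hybrid update at the heart of SemiFL, where the BS both aggregates device gradients and computes its own gradient on uploaded samples, can be sketched as follows. This is a minimal illustrative toy (a least-squares model, sample-count weighting, and all function names are assumptions for exposition), not the paper's exact formulation, which also accounts for transmit power, latency, and computation distortion.

```python
# Illustrative sketch of a SemiFL-style round (assumed formulation, not the
# paper's exact algorithm): devices send gradients computed on data they keep,
# while some samples are uploaded so the BS can compute a gradient itself.
import numpy as np

def local_gradient(w, X, y):
    """Least-squares gradient on one dataset (toy model for illustration)."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def semifl_round(w, device_data, uploaded_data, lr=0.1):
    """One round: mix the average device gradient with the BS gradient
    computed on uploaded samples, weighted by sample counts (an assumption)."""
    g_dev = np.mean([local_gradient(w, X, y) for X, y in device_data], axis=0)
    X_bs = np.vstack([X for X, _ in uploaded_data])          # BS-side data
    y_bs = np.concatenate([y for _, y in uploaded_data])
    g_bs = local_gradient(w, X_bs, y_bs)
    n_dev = sum(len(y) for _, y in device_data)
    n_bs = len(y_bs)
    g = (n_dev * g_dev + n_bs * g_bs) / (n_dev + n_bs)       # sample-weighted mix
    return w - lr * g

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])

def make(n):
    X = rng.normal(size=(n, 2))
    return X, X @ w_true + 0.01 * rng.normal(size=n)

devices = [make(20) for _ in range(3)]   # data stays on devices; gradients sent
uploads = [make(10) for _ in range(2)]   # samples uploaded to the BS
w = np.zeros(2)
for _ in range(200):
    w = semifl_round(w, devices, uploads)
```

Under this toy setup the learned `w` converges to the generating weights, illustrating why letting the BS contribute gradients from uploaded samples can accelerate convergence relative to aggregation-only FL.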