Journal
IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS
Volume 41, Issue 4, Pages 977-989
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JSAC.2023.3242702
Keywords
Servers; Load modeling; Computational modeling; Cryptography; Data models; Symbols; Protocols; Federated learning; secure aggregation; secret sharing; dropout resiliency; optimal communication load
We propose SwiftAgg+, a novel secure aggregation protocol that aggregates local models from N distributed users in federated learning systems. SwiftAgg+ achieves optimal communication loads and provides information-theoretic security guarantees. It also allows for a flexible trade-off between communication loads and the number of active communication links.
We propose SwiftAgg+, a novel secure aggregation protocol for federated learning systems, where a central server aggregates local models of N ∈ ℕ distributed users, each of size L ∈ ℕ, trained on their local data, in a privacy-preserving manner. SwiftAgg+ can significantly reduce the communication overheads without any compromise on security, and achieves optimal communication loads within diminishing gaps. Specifically, in the presence of at most D = o(N) dropout users, SwiftAgg+ achieves a per-user communication load of (1 + O(1/N))L symbols and a server communication load of (1 + O(1/N))L symbols, with a worst-case information-theoretic security guarantee against any subset of up to T = o(N) semi-honest users who may also collude with the curious server. Moreover, the proposed SwiftAgg+ allows for a flexible trade-off between communication loads and the number of active communication links. In particular, for T < N - D and for any K ∈ ℕ, SwiftAgg+ can achieve a server communication load of (1 + T/K)L symbols and a per-user communication load of up to (1 + (T + D)/K)L symbols, where the number of pair-wise active connections in the network is (N/2)(K + T + D + 1).
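The trade-off stated in the abstract can be made concrete by evaluating its closed-form expressions. The sketch below is hypothetical illustration (it is not code from the paper); it simply computes the server load (1 + T/K)L, the worst-case per-user load (1 + (T + D)/K)L, and the number of pairwise active connections (N/2)(K + T + D + 1) for given parameters, so one can see how increasing K reduces communication loads at the cost of more active links.

```python
# Hypothetical sketch: evaluate SwiftAgg+'s communication-load trade-off
# using the closed-form expressions quoted in the abstract.
def swiftagg_plus_loads(N, T, D, K, L):
    """Return (server_load, per_user_load, active_links).

    N: number of users, T: colluding semi-honest users (T < N - D),
    D: dropout users, K: trade-off parameter, L: model size in symbols.
    """
    server_load = (1 + T / K) * L            # (1 + T/K) * L symbols
    per_user_load = (1 + (T + D) / K) * L    # up to (1 + (T+D)/K) * L symbols
    active_links = (N / 2) * (K + T + D + 1) # pairwise active connections
    return server_load, per_user_load, active_links

# Larger K lowers both communication loads but activates more links.
for K in (1, 5, 10):
    print(K, swiftagg_plus_loads(N=100, T=5, D=5, K=K, L=1000))
```

For example, with N = 100, T = D = 5, and L = 1000, moving from K = 1 to K = 10 cuts the per-user load from 11000 to 2000 symbols while the number of active connections grows from 600 to 1050.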