Journal
IEEE NETWORK
Volume 33, Issue 6, Pages 180-187
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/MNET.2019.1900025
Keywords
Servers; Reliability; Computational modeling; Bandwidth; Privacy; Knowledge engineering; Convergence
Funding
- UBC Four-Year Doctoral Fellowship
- Natural Sciences and Engineering Research Council of Canada
- National Engineering Laboratory for Big Data System Computing Technology at Shenzhen University, China
Privacy concerns arise when a central server holds copies of user datasets. This has driven a paradigm shift in learning networks from centralized in-cloud learning to distributed on-device learning. Benefiting from parallel computing, on-device learning networks have a lower bandwidth requirement than in-cloud learning networks, and they offer further desirable properties such as privacy preservation and flexibility. However, on-device learning networks are vulnerable to malfunctioning terminals across the network. The worst-case malfunctioning terminals are Byzantine adversaries, which can perform arbitrary harmful operations to compromise the learned model based on full knowledge of the network. The design of secure learning algorithms has therefore become an emerging topic for on-device learning networks with Byzantine adversaries. In this article, we present a comprehensive overview of the prevalent secure learning algorithms for two promising classes of on-device learning networks: Federated-Learning networks and decentralized-learning networks. We also review several future research directions for Federated-Learning and decentralized-learning networks.
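The vulnerability described above can be illustrated with a minimal sketch. Coordinate-wise median is one well-known Byzantine-robust aggregation rule (the article surveys several such algorithms); the client updates below are invented purely for illustration, not drawn from the paper:

```python
import numpy as np

def mean_aggregate(updates):
    # Standard federated averaging: element-wise mean of client updates.
    # A single Byzantine client with an extreme update can shift this arbitrarily.
    return np.mean(updates, axis=0)

def median_aggregate(updates):
    # Coordinate-wise median: a classic Byzantine-robust aggregation rule
    # that tolerates a minority of arbitrarily corrupted updates.
    return np.median(updates, axis=0)

# Three honest clients report similar gradients; one Byzantine client
# sends an arbitrarily large update to poison the aggregated model.
honest = [np.array([1.0, 1.0]), np.array([1.1, 0.9]), np.array([0.9, 1.1])]
byzantine = [np.array([1000.0, -1000.0])]
updates = np.stack(honest + byzantine)

print(mean_aggregate(updates))    # dragged far from the honest consensus
print(median_aggregate(updates))  # stays close to the honest values
```

The sketch shows why averaging alone is unsafe under the Byzantine threat model the abstract describes: the mean is unboundedly sensitive to a single adversary, whereas the median bounds each coordinate by the honest majority.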