Journal
IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS
Volume 16, Issue 10, Pages 6532-6542
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TII.2019.2945367
Keywords
Training; Privacy; Cloud computing; Servers; Differential privacy; Informatics; Federated learning (FL); industrial artificial intelligence; privacy protection
Funding
- National Key R&D Program of China [2017YFB0802300, 2017YFB0802000]
- National Natural Science Foundation of China [61972454, 61802051, 61772121, 61728102, 61472065]
- Peng Cheng Laboratory Project of Guangdong Province [PCL2018KP004]
- Guangxi Key Laboratory of Cryptography and Information Security [GCIS201804, TII-19-2524]
Abstract
By leveraging deep learning-based technologies, industrial artificial intelligence (IAI) has been applied to solve various challenging industrial problems in Industry 4.0. However, for privacy reasons, traditional centralized training may be unsuitable for industrial scenarios driven by sensitive data, such as healthcare and autopilot. Recently, federated learning has received widespread attention, since it enables participants to collaboratively learn a shared model without revealing their local data. However, studies have shown that, by exploiting the shared parameters, adversaries can still compromise industrial applications such as auto-driving navigation systems, medical data in wearable devices, and industrial robots' decision making. To solve this problem, in this article we propose an efficient and privacy-enhanced federated learning (PEFL) scheme for IAI. Compared with existing solutions, PEFL is noninteractive and can prevent private data from being leaked even if multiple entities collude with each other. Moreover, extensive experiments with real-world data demonstrate the superiority of PEFL in terms of accuracy and efficiency.
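The setting the abstract describes can be illustrated with a minimal sketch: clients train locally, perturb their shared updates with Gaussian noise (a common differential-privacy-style measure hinted at by the paper's keywords), and a server averages the noisy updates so raw data never leaves the clients. This is a generic federated-averaging toy on a one-parameter least-squares model, not the paper's actual PEFL protocol; all function names here are illustrative assumptions.

```python
import random

def local_update(w, data, lr=0.1):
    """One gradient-descent step on a 1-D least-squares model y = w * x,
    computed entirely on the client's private samples."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def noisy_share(w, sigma=0.01):
    """Client perturbs its update before sharing, so the server (or an
    eavesdropper) never sees the exact locally trained parameter."""
    return w + random.gauss(0.0, sigma)

def federated_round(global_w, client_datasets):
    """Server averages the noisy client updates; only parameters travel."""
    shared = [noisy_share(local_update(global_w, d)) for d in client_datasets]
    return sum(shared) / len(shared)

# Toy run: three clients, each holding private samples of y = 2 * x.
random.seed(0)
clients = [[(1.0, 2.0)], [(2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(200):
    w = federated_round(w, clients)
print(round(w, 2))  # converges near the true slope 2.0
```

Note that this plain noise-addition sketch is exactly the kind of baseline the paper claims to improve on: PEFL is stated to be noninteractive and collusion-resistant, properties a simple noisy average does not provide on its own.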