Journal
IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS
Volume -, Issue -, Pages -
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TSMC.2023.3293462
Keywords
Index Terms: Classification; concept drift; data streams; federated learning (FL); prototype learning
Abstract
Distributed data stream mining has gained increasing attention in recent years, as many organizations collect tremendous amounts of streaming data from different locations. Existing studies mainly focus on learning evolving concepts on distributed data streams, while privacy has received little investigation. In this article, for the first time, we develop a federated learning framework for distributed concept-drifting data streams, called FedStream. The proposed method captures evolving concepts by dynamically maintaining a set of prototypes with error-driven representative learning. Meanwhile, a new metric-learning-based prototype transformation technique is introduced to preserve privacy among participating clients in the distributed data stream setting. Extensive experiments on both real-world and synthetic datasets demonstrate the superiority of FedStream; it even achieves performance competitive with state-of-the-art distributed learning methods.
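The abstract's "error-driven representative learning" can be illustrated with a minimal sketch: keep a bounded pool of labeled prototypes, nudge the nearest prototype toward each correctly classified sample (tracking drift), and spawn or replace a prototype when the nearest one misclassifies. All names below (`PrototypePool`, `max_protos`, `lr`) are hypothetical simplifications for illustration, not the authors' actual FedStream algorithm, which also involves a privacy-preserving prototype transformation not shown here.

```python
import numpy as np

class PrototypePool:
    """Hypothetical sketch of error-driven prototype maintenance on a
    drifting stream (simplified; not the published FedStream method)."""

    def __init__(self, max_protos=50, lr=0.1):
        self.protos = []          # list of (vector, label) pairs
        self.max_protos = max_protos
        self.lr = lr              # step size for drift tracking

    def _nearest(self, x):
        # index of the prototype closest to x (Euclidean distance)
        dists = [np.linalg.norm(x - p) for p, _ in self.protos]
        return int(np.argmin(dists))

    def predict(self, x):
        if not self.protos:
            return None
        return self.protos[self._nearest(x)][1]

    def update(self, x, y):
        """Error-driven update: spawn a prototype on mistakes, otherwise
        nudge the matching prototype toward the sample."""
        if not self.protos:
            self.protos.append((x.copy(), y))
            return
        i = self._nearest(x)
        p, label = self.protos[i]
        if label == y:
            # correct prediction: drift the prototype toward the sample
            self.protos[i] = (p + self.lr * (x - p), label)
        elif len(self.protos) < self.max_protos:
            # error: represent the new region (or new concept) directly
            self.protos.append((x.copy(), y))
        else:
            # pool full: replace the misleading prototype
            self.protos[i] = (x.copy(), y)

# Demo on a toy two-class stream: class y is drawn around mean (3y, 3y).
rng = np.random.default_rng(0)
pool = PrototypePool()
for _ in range(200):
    y = int(rng.integers(0, 2))
    x = rng.normal(loc=3.0 * y, scale=0.5, size=2)
    pool.update(x, y)
```

In a federated variant, each client would maintain such a pool locally and share only (transformed) prototypes with the server rather than raw samples, which is what motivates the metric-learning-based transformation mentioned in the abstract.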