Article

A Federated Learning-Based Patient Monitoring System in Internet of Medical Things

Journal

IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS
Volume 10, Issue 4, Pages 1622-1628

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCSS.2022.3228965

Keywords

Clustering; federated learning (FL); Internet of Medical Things (IoMT); knowledge distillation (KD); patient monitoring


This article proposes a secure patient monitoring system using federated learning, which performs training on local devices and preserves data privacy and security by only sending weight matrices to the server for aggregation. The system intelligently divides participants into clusters based on available resources and trains suitable models on each cluster. High-performing clusters distill knowledge to improve the performance of small-size clusters. Experimental results demonstrate the successful operation of the proposed system under unequal resource conditions.
Monitoring patient activities is a promising application of the Internet of Medical Things (IoMT) and is revolutionizing clinical diagnosis. An IoMT system uses sensory data collected from smart devices to train a model on the server; the trained model then recognizes patient activities on the smart devices. However, training the model on the server raises privacy concerns and security threats. The sensitive medical data transferred from the smart devices to the Cloud is exposed to various cybersecurity attacks, such as distributed denial of service (DDoS), phishing, network penetration, and side-channel attacks. This article proposes a secure patient monitoring system using federated learning (FL). The system performs training on local devices and sends only weight matrices to the server for aggregation; thus, it preserves data privacy and avoids security compromises. The system intelligently divides the participants into clusters based on the available resources, trains a suitable model on each cluster, and enhances performance via knowledge distillation (KD): the models of high-performing clusters distill knowledge to the models of small-size clusters to improve their performance. The experimental results illustrate that the proposed system works successfully in the presence of unequal resources.
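As a rough illustration of the two mechanisms the abstract describes, the sketch below shows FedAvg-style aggregation of client weight matrices on the server and a Hinton-style knowledge-distillation loss between a teacher (high-performing cluster) model and a student (small-cluster) model. This is a minimal sketch under stated assumptions, not the authors' implementation; the function names (`aggregate_weights`, `kd_loss`) and the flat weight-vector representation are illustrative only.

```python
# Minimal sketch (not the paper's code): FedAvg-style weight aggregation
# and a softened-softmax knowledge-distillation loss.
import math

def aggregate_weights(client_weights, client_sizes):
    """Dataset-size-weighted average of flat weight vectors,
    as in FedAvg: w = sum_k (n_k / n) * w_k."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    agg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            agg[i] += (n / total) * w[i]
    return agg

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(l / temperature) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy of the student's softened outputs against the
    teacher's softened outputs (Hinton-style distillation)."""
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return -sum(ti * math.log(si) for ti, si in zip(t, s))

# Example: two clients whose weights are averaged by local dataset size.
w = aggregate_weights([[1.0, 2.0], [3.0, 4.0]], [10, 30])
# w == [2.5, 3.5]
```

Only the aggregated weights (not raw sensor data) ever leave the devices, which is the privacy property the system relies on; the KD term would be added to the small-cluster model's training loss.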

