Article

An Efficient Federated Distillation Learning System for Multitask Time Series Classification

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TIM.2022.3201203

Keywords

Task analysis; Feature extraction; Heuristic algorithms; Servers; Multitasking; Time series analysis; Instruments; Data mining; deep learning; federated learning (FL); knowledge distillation; time series classification (TSC)

Funding

  1. Natural Science Foundation of Sichuan Province [2022NSFSC0568]
  2. National Natural Science Foundation of China [62172342]
  3. Fundamental Research Funds for the Central Universities, China


This article proposes an efficient federated distillation learning system (EFDLS) for multitask time series classification (TSC). EFDLS consists of a central server and multiple mobile users, where different users may run different TSC tasks. EFDLS has two novel components: a feature-based student-teacher (FBST) framework and a distance-based weights matching (DBWM) scheme. For each user, the FBST framework transfers knowledge from its teacher's hidden layers to its student's hidden layers via knowledge distillation, where the teacher and student have identical network structures. For each connected user, its student model's hidden-layer weights are uploaded to the EFDLS server periodically. The DBWM scheme is deployed on the server, with the least square distance (LSD) used to measure the similarity between the weights of two given models. This scheme finds a partner for each connected user such that the user's and its partner's weights are the closest among all the weights uploaded. The server exchanges the user's and its partner's weights and sends them back to these two users, which then load the received weights into their teachers' hidden layers. Experimental results show that, compared with a number of state-of-the-art federated learning (FL) algorithms, the proposed EFDLS wins on 20 of 44 standard UCR2018 datasets and achieves the highest mean accuracy (70.14%) across these datasets. In particular, compared with a single-task baseline, EFDLS achieves a win/tie/lose record of 32/4/8 and improves mean accuracy by approximately 4%.
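The server-side matching step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`lsd`, `match_partners`) are hypothetical, and it assumes each user's uploaded hidden-layer weights have been flattened into a single vector so the least square distance reduces to a sum of squared differences.

```python
import numpy as np

def lsd(w_a, w_b):
    # Least square distance between two flattened weight vectors
    # (sum of squared element-wise differences).
    return float(np.sum((np.asarray(w_a) - np.asarray(w_b)) ** 2))

def match_partners(uploaded):
    """For each connected user, pick the partner whose uploaded
    student weights are closest under LSD (the DBWM idea, sketched).

    uploaded: dict mapping user id -> flattened weight vector.
    Returns: dict mapping user id -> chosen partner id.
    """
    partners = {}
    for u in uploaded:
        best, best_d = None, float("inf")
        for v in uploaded:
            if v == u:
                continue  # a user is never its own partner
            d = lsd(uploaded[u], uploaded[v])
            if d < best_d:
                best, best_d = v, d
        partners[u] = best
    return partners

# Toy example: u1 and u2 have near-identical weights, u3 is far away.
uploaded = {"u1": [0.0, 0.0], "u2": [0.1, 0.0], "u3": [5.0, 5.0]}
print(match_partners(uploaded))  # → {'u1': 'u2', 'u2': 'u1', 'u3': 'u2'}
```

Note that this greedy per-user choice means the relation need not be symmetric (here u3's closest partner is u2, while u2's is u1); after matching, the server would exchange each pair's weights and return them for loading into the teachers' hidden layers.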

Authors

