4.7 Article

A double-layer attention based adversarial network for partial transfer learning in machinery fault diagnosis

Journal

COMPUTERS IN INDUSTRY
Volume 127

Publisher

ELSEVIER
DOI: 10.1016/j.compind.2021.103399

Keywords

Partial transfer learning; Mechanical fault diagnosis; Generative adversarial network; Domain adaptation

Funding

  1. National Natural Science Foundation of China [51775343]
  2. Shanghai Pujiang Program [18PJC031]


Recently, deep transfer learning approaches have been widely developed for mechanical fault diagnosis, as they can identify the health state of unlabeled data in the target domain with the help of knowledge learned from labeled data in the source domain. The tremendous success of these methods generally rests on the assumption that the label spaces across different domains are identical. However, the partial transfer scenario, in which the label spaces are not identical, is more common in industrial applications. This partial transfer scenario raises a more difficult problem: it is hard to know where to transfer, since the shared label space is unavailable. To tackle this challenging problem, a double-layer attention based adversarial network (DA-GAN) is proposed in this paper. The proposed method offers a new angle on the question of where to transfer by constructing two attention matrices, one for domains and one for samples. These attention matrices guide the model on which parts of the data should be emphasized or ignored before conducting domain adaptation. Experimental results on both transfer in the identical machine (TIM) and transfer on different machines (TDM) suggest that the DA-GAN model shows great superiority on the mechanical partial transfer problem. (C) 2021 Elsevier B.V. All rights reserved.
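
The abstract describes DA-GAN only at a high level. The following is a minimal, hypothetical sketch (in PyTorch, not the authors' code) of the general idea behind attention-weighted adversarial partial domain adaptation: each unlabeled target sample's contribution to the adversarial alignment loss is scaled by an attention score. All module names, layer sizes, and the confidence-based weighting heuristic below are illustrative assumptions; the paper's domain-level attention matrix over source classes is omitted for brevity.

import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    # Gradient reversal layer commonly used in adversarial domain adaptation.
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

class PartialTransferSketch(nn.Module):
    # Feature extractor + fault classifier + domain discriminator (all sizes assumed).
    def __init__(self, in_dim=1024, feat_dim=256, num_classes=10):
        super().__init__()
        self.feature = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, num_classes)
        self.discriminator = nn.Sequential(
            nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x, lambd=1.0):
        f = self.feature(x)
        logits = self.classifier(f)
        # Sample-level attention (an assumption in this sketch): use the
        # classifier's confidence as a proxy for how likely a target sample
        # is to belong to the shared label space.
        attention = torch.softmax(logits, dim=1).max(dim=1).values.detach()
        d_out = self.discriminator(GradReverse.apply(f, lambd)).squeeze(1)
        return logits, d_out, attention

# Usage: weight the adversarial (domain) loss of each unlabeled target sample
# by its attention score, so samples from outlier classes contribute less to
# the feature alignment.
model = PartialTransferSketch()
x_t = torch.randn(8, 1024)                # unlabeled target-domain batch
_, d_out, w = model(x_t)
target_labels = torch.ones_like(d_out)    # domain label "target" = 1
adv_loss = (w * nn.functional.binary_cross_entropy_with_logits(
    d_out, target_labels, reduction="none")).mean()
adv_loss.backward()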
