Article

A novel time-frequency Transformer based on self-attention mechanism and its application in fault diagnosis of rolling bearings

Journal

MECHANICAL SYSTEMS AND SIGNAL PROCESSING
Volume 168

Publisher

ACADEMIC PRESS LTD- ELSEVIER SCIENCE LTD
DOI: 10.1016/j.ymssp.2021.108616

Keywords

Fault diagnosis; Deep learning; Transformer; Self-attention mechanism; Rolling bearings

Funding

  1. National Natural Science Foundation of China [52075095]


This paper introduces a novel time-frequency Transformer model to address the shortcomings of traditional models, demonstrating superior fault diagnosis performance in comparison with benchmark models and other state-of-the-art methods.
Deep learning (DL) has greatly extended the scope of data-driven fault diagnosis models. However, classical convolutional and recurrent structures have inherent limitations in computational efficiency and feature representation, while the latest attention-based Transformer architecture has not yet been applied in this field. To address these problems, we propose a novel time-frequency Transformer (TFT) model inspired by the massive success of the vanilla Transformer in sequence processing. Specifically, we design a fresh tokenizer and encoder module to extract effective abstractions from the time-frequency representation (TFR) of vibration signals. On this basis, a new end-to-end fault diagnosis framework based on the time-frequency Transformer is presented. Through case studies on bearing experimental datasets, we construct the optimal Transformer structure and verify its fault diagnosis performance. The superiority of the proposed method is demonstrated in comparison with benchmark models and other state-of-the-art methods.
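The pipeline the abstract describes — compute a time-frequency representation of the vibration signal, tokenize it, and run the tokens through a self-attention encoder — can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: the STFT parameters, patch size, embedding dimension, and random weights below are all illustrative assumptions, and the real TFT uses learned weights and a full multi-layer encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def stft_magnitude(x, win=64, hop=32):
    # Simple TFR: magnitude of a short-time Fourier transform (Hann window)
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1)).T  # (freq, time)

def tokenize(tfr, patch=8):
    # Tokenizer sketch: cut the TFR into non-overlapping patch x patch
    # tiles and flatten each tile into one token vector
    f, t = tfr.shape
    f, t = f - f % patch, t - t % patch
    tfr = tfr[:f, :t]
    return (tfr.reshape(f // patch, patch, t // patch, patch)
               .transpose(0, 2, 1, 3)
               .reshape(-1, patch * patch))

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention over the token sequence
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[1])
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)            # rows are attention weights
    return A @ V

# Synthetic "vibration" signal: a carrier tone plus periodic fault-like impulses
t = np.arange(4096) / 4096.0
x = (np.sin(2 * np.pi * 50 * t)
     + 0.5 * np.sin(2 * np.pi * 400 * t) * (np.sin(2 * np.pi * 8 * t) > 0.95)
     + 0.1 * rng.standard_normal(t.shape))

tfr = stft_magnitude(x)          # time-frequency representation
tokens = tokenize(tfr)           # token sequence, one vector per TFR patch

d = 16                           # illustrative embedding dimension
We = rng.standard_normal((tokens.shape[1], d)) * 0.1   # token embedding (random here)
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))

E = tokens @ We                  # embedded tokens
H = self_attention(E, Wq, Wk, Wv)  # one encoder attention layer
feature = H.mean(axis=0)         # pooled feature a classifier head would consume
```

In the actual TFT, `We`, `Wq`, `Wk`, and `Wv` are trained end-to-end, attention is multi-head with feed-forward and normalization sublayers, and the pooled feature feeds a fault-class softmax; the sketch only shows the data flow from raw signal to attention output.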

