Journal
MECHANICAL SYSTEMS AND SIGNAL PROCESSING
Volume 168, Issue -, Pages -
Publisher
ACADEMIC PRESS LTD- ELSEVIER SCIENCE LTD
DOI: 10.1016/j.ymssp.2021.108616
Keywords
Fault diagnosis; Deep learning; Transformer; Self-attention mechanism; Rolling bearings
Funding
- National Natural Science Foundation of China [52075095]
This paper introduces a novel time-frequency Transformer model to address the shortcomings of traditional models, and demonstrates its superior fault diagnosis performance against benchmark models and other state-of-the-art methods.
Deep learning (DL) has greatly extended the scope of data-driven fault diagnosis models. However, classical convolutional and recurrent structures have inherent drawbacks in computational efficiency and feature representation, while the latest Transformer architecture, based on the attention mechanism, has not yet been applied in this field. To solve these problems, we propose a novel time-frequency Transformer (TFT) model, inspired by the massive success of the vanilla Transformer in sequence processing. Specifically, we design a fresh tokenizer and encoder module to extract effective abstractions from the time-frequency representation (TFR) of vibration signals. On this basis, a new end-to-end fault diagnosis framework based on the time-frequency Transformer is presented. Through case studies on bearing experimental datasets, we determine the optimal Transformer structure and verify its fault diagnosis performance. The superiority of the proposed method is demonstrated through comparison with benchmark models and other state-of-the-art methods.
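The abstract names two core ingredients: a tokenizer that turns the TFR of a vibration signal into a token sequence, and a self-attention encoder that extracts abstractions for classification. The following is a minimal NumPy sketch of those two ideas only, not the authors' actual TFT architecture; the patch size, model width, and the four fault classes are illustrative assumptions.

```python
import numpy as np

def tokenize_tfr(tfr, patch=(8, 8)):
    """Split a time-frequency representation into flattened patch tokens
    (assumed patch-style tokenizer; the paper's tokenizer may differ)."""
    F, T = tfr.shape
    ph, pw = patch
    tokens = [tfr[i:i + ph, j:j + pw].ravel()
              for i in range(0, F - ph + 1, ph)
              for j in range(0, T - pw + 1, pw)]
    return np.stack(tokens)                      # (num_tokens, ph * pw)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over the token sequence."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    scores -= scores.max(axis=1, keepdims=True)  # numerically stable softmax
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)            # each row sums to 1
    return A @ V

rng = np.random.default_rng(0)
tfr = rng.standard_normal((32, 64))              # e.g. 32 freq bins x 64 time steps
tokens = tokenize_tfr(tfr)                       # -> (32, 64): 32 tokens of dim 64

d_model = 16
Wq, Wk, Wv = (rng.standard_normal((64, d_model)) / 8.0 for _ in range(3))
feat = self_attention(tokens, Wq, Wk, Wv).mean(axis=0)   # pool tokens -> (16,)

W_cls = rng.standard_normal((d_model, 4)) / 4.0  # 4 hypothetical fault classes
logits = feat @ W_cls
probs = np.exp(logits - logits.max())
probs /= probs.sum()                             # class probabilities
```

A real end-to-end framework would stack several such encoder layers with learned parameters, positional embeddings, and a trained classification head; this sketch only shows how a 2-D TFR becomes a token sequence that a Transformer can attend over.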