4.7 Article

Dyformer: A dynamic transformer-based architecture for multivariate time series classification

Journal

INFORMATION SCIENCES
Volume 656, Issue -, Pages -

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2023.119881

Keywords

Multivariate time series classification; Data mining; Deep learning

Abstract

This paper proposes a dynamic transformer-based architecture called Dyformer for multivariate time series classification. Dyformer captures multi-scale features through hierarchical pooling and adaptive learning strategies, and improves model performance by introducing feature-map-wise attention mechanisms and a joint loss function.
Multivariate time series classification is a crucial task with applications in broad areas such as finance, medicine, and engineering. Transformers are promising for time series classification, but as generic architectures they have limited capability to capture the distinctive characteristics inherent in time series data and to adapt to diverse architectural requirements. This paper proposes a novel dynamic transformer-based architecture called Dyformer to address these limitations in multivariate time series classification. Dyformer incorporates hierarchical pooling to decompose time series into subsequences with different frequency components. It then employs Dyformer modules to realize adaptive learning strategies for the different frequency components based on a dynamic architecture. Furthermore, we introduce feature-map-wise attention mechanisms to capture multi-scale temporal dependencies and a joint loss function to facilitate model training. To evaluate the performance of Dyformer, we conducted extensive experiments on 30 benchmark datasets. The results demonstrate that our model consistently outperforms a wide range of state-of-the-art methods and baseline approaches. Our model also copes well with limited training samples when pre-trained.
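
The abstract describes the architecture only at a high level. As a concrete illustration, the following minimal PyTorch sketch shows what hierarchical pooling into progressively coarser views and a feature-map-wise attention gate could look like; the module names, the use of average pooling, and the squeeze-and-excitation-style channel gating are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (illustrative assumptions, not the authors' code) of two
# ideas from the abstract: hierarchical pooling that decomposes a series
# into coarser views, and feature-map-wise attention that reweights the
# channels of a feature map.
import torch
import torch.nn as nn


class HierarchicalPooling(nn.Module):
    """Produce progressively downsampled views of an input series."""

    def __init__(self, num_levels: int = 3, kernel_size: int = 2):
        super().__init__()
        # Each AvgPool1d halves the temporal length (stride = kernel_size).
        self.pools = nn.ModuleList(
            [nn.AvgPool1d(kernel_size) for _ in range(num_levels - 1)]
        )

    def forward(self, x: torch.Tensor) -> list[torch.Tensor]:
        # x: (batch, channels, time); return one view per scale.
        views = [x]
        for pool in self.pools:
            views.append(pool(views[-1]))
        return views


class FeatureMapAttention(nn.Module):
    """Squeeze-and-excitation-style gate over feature maps (channels)."""

    def __init__(self, channels: int, reduction: int = 2):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Pool over time, compute a weight per channel, rescale the input.
        weights = self.gate(x.mean(dim=-1))        # (batch, channels)
        return x * weights.unsqueeze(-1)


if __name__ == "__main__":
    series = torch.randn(8, 6, 128)                # 8 samples, 6 variables
    views = HierarchicalPooling(num_levels=3)(series)
    print([tuple(v.shape) for v in views])         # lengths 128, 64, 32
    attended = FeatureMapAttention(channels=6)(views[0])
    print(tuple(attended.shape))                   # (8, 6, 128)
```

The joint loss mentioned in the abstract is not specified here; one common pattern for such multi-scale designs is to sum a per-scale classification loss over the predictions at each resolution, but the paper's exact formulation may differ.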
