Article

Dyformer: A dynamic transformer-based architecture for multivariate time series classification

Journal

INFORMATION SCIENCES
Volume 656

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2023.119881

Keywords

Multivariate time series classification; Data mining; Deep learning


This paper proposes a dynamic transformer-based architecture called Dyformer for multivariate time series classification. Dyformer captures multi-scale features through hierarchical pooling and adaptive learning strategies, and improves model performance by introducing feature-map-wise attention mechanisms and a joint loss function.
Multivariate time series classification is a crucial task with applications in broad areas such as finance, medicine, and engineering. Transformers are promising for time series classification, but as generic architectures they have limited capability to effectively capture the distinctive characteristics inherent in time series data and to adapt to diverse architectural requirements. This paper proposes a novel dynamic transformer-based architecture called Dyformer to address these limitations of traditional transformers in multivariate time series classification. Dyformer incorporates hierarchical pooling to decompose time series into subsequences with different frequency components. It then employs Dyformer modules to achieve adaptive learning strategies for the different frequency components based on a dynamic architecture. Furthermore, we introduce feature-map-wise attention mechanisms to capture multi-scale temporal dependencies, and a joint loss function to facilitate model training. To evaluate the performance of Dyformer, we conducted extensive experiments on 30 benchmark datasets. The results demonstrate that our model consistently outperforms a wide range of state-of-the-art methods and baseline approaches. Our model also copes well with limited training samples when pre-trained.
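The hierarchical pooling step described above (decomposing a series into subsequences with different frequency components) can be sketched as follows. This is only an illustration: the paper's exact pooling windows and number of levels are not given here, so the window size of 2, the number of levels, and the NumPy-based implementation are all assumptions.

```python
import numpy as np

def hierarchical_pooling(series: np.ndarray, levels: int = 3) -> list:
    """Decompose a multivariate series of shape (T, C) into `levels`
    subsequences via repeated average pooling with window 2.
    Each level halves the length, retaining lower-frequency content."""
    outputs = [series]
    current = series
    for _ in range(levels - 1):
        T = current.shape[0] - current.shape[0] % 2  # trim odd tail
        current = current[:T].reshape(T // 2, 2, -1).mean(axis=1)
        outputs.append(current)
    return outputs

# A 2-channel series of length 16 yields scales of length 16, 8, and 4.
x = np.arange(32, dtype=float).reshape(16, 2)
scales = hierarchical_pooling(x, levels=3)
print([s.shape for s in scales])  # [(16, 2), (8, 2), (4, 2)]
```

Each coarser level would then be fed to its own adaptively configured transformer module, per the abstract's description of the dynamic architecture.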


