Journal
2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)
Volume -, Issue -, Pages -
Publisher
IEEE
DOI: 10.1109/IJCNN52387.2021.9534392
Keywords
Data mining; time series classification; deep learning; neural architecture search; convolutional neural network; LSTM
In this paper, a robust neural temporal search (RNTS) framework is proposed for analyzing TSC data, which combines temporal search network and attentional LSTM network to extract basic features and explore complex relationships. Experimental results show that the framework outperforms several state-of-the-art approaches on 24 standard datasets from the UCR 2018 archive in terms of top-1 accuracy-based measures.
Over the years, a large number of deep learning algorithms have been developed for time series classification (TSC). These algorithms are usually designed by researchers drawing on prior knowledge and experience, so it remains a critical challenge for beginners to devise effective architectures for diverse TSC problems. To this end, we propose a robust neural temporal search (RNTS) framework for identifying the relationships and features in TSC data, which mainly consists of a temporal search network and an attentional LSTM network. Specifically, inspired by the idea of neural architecture search (NAS), the temporal search network automatically adapts its structure to the characteristics of each dataset and is responsible for extracting basic features. The attentional LSTM network then explores the complex shapelets and relationships that the former may overlook. Experimental results demonstrate that, compared with a number of state-of-the-art approaches, RNTS achieves the best overall performance on 24 standard datasets selected from the UCR 2018 archive in terms of three measures based on top-1 accuracy.
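The attentional LSTM component described above, i.e., attention-weighted pooling over a sequence of recurrent hidden states, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function names, the single attention parameter vector, and all dimensions are assumptions, and random noise stands in for real LSTM outputs.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Collapse T LSTM hidden states of size d, shape (T, d),
    into a single context vector of shape (d,) by a learned
    attention weighting over the time axis."""
    scores = hidden_states @ w   # (T,) unnormalized relevance per step
    alpha = softmax(scores)      # attention weights summing to 1
    return alpha @ hidden_states # weighted sum over time steps

rng = np.random.default_rng(0)
T, d = 50, 16                    # assumed sequence length, hidden size
H = rng.normal(size=(T, d))      # stand-in for real LSTM outputs
w = rng.normal(size=d)           # hypothetical attention parameters
context = attention_pool(H, w)   # fixed-size summary of the series
print(context.shape)             # (16,)
```

In practice the context vector would feed a softmax classifier over the TSC labels; the pooling lets the network emphasize the time steps where discriminative shapelets occur rather than relying only on the final hidden state.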