4.6 Article

Deep attention based music genre classification

Journal

NEUROCOMPUTING
Volume 372, Pages 84-91

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2019.09.054

Keywords

Music genre classification; Deep neural networks; Serial attention; Parallelized attention

Funding

  1. National Key Research and Development Program of China [2017YFB1300200]
  2. National Natural Science Foundation of P. R. China [61602082, 61672130]

Abstract

As an important component of music information retrieval, music genre classification has attracted considerable attention in recent years. Benefiting from the strong performance of deep neural networks in computer vision, some researchers apply CNNs to music genre classification with audio spectrograms, which resemble RGB images, as input. These methods rest on the latent assumption that spectra at different temporal steps are equally important. However, this assumption contradicts the theory of the processing bottleneck in psychology, as well as our observations of audio spectrograms. Accounting for the differences among spectra, we propose a new model that incorporates an attention mechanism based on a bidirectional recurrent neural network. Two attention-based models, serial attention and parallelized attention, are implemented in this paper. Compared with serial attention, parallelized attention is more flexible and achieves better results in our experiments. In particular, the CNN-based parallelized attention models taking STFT spectrograms as input outperform previous work. (C) 2019 Elsevier B.V. All rights reserved.
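
To make the abstract's idea concrete, the following is a minimal sketch of frame-level attention on top of a bidirectional RNN over spectrogram frames. It is not the authors' exact architecture: the GRU cell, layer sizes, genre count, and single-head softmax attention are illustrative assumptions in PyTorch.

# Minimal sketch (assumed PyTorch implementation, not the paper's exact model):
# a bidirectional GRU reads spectrogram frames, and an attention layer weights
# each time step so frames are not treated as equally important.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveBiRNNClassifier(nn.Module):
    def __init__(self, n_bins=128, hidden=128, n_genres=10):  # sizes are assumptions
        super().__init__()
        self.rnn = nn.GRU(n_bins, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)   # one attention score per time step
        self.out = nn.Linear(2 * hidden, n_genres)

    def forward(self, spec):                    # spec: (batch, time, n_bins)
        h, _ = self.rnn(spec)                   # h: (batch, time, 2*hidden)
        scores = self.attn(h).squeeze(-1)       # (batch, time)
        weights = F.softmax(scores, dim=1)      # attention weights over time steps
        context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)  # weighted sum of frames
        return self.out(context)                # genre logits

# Example: a batch of 4 spectrograms with 256 frames and 128 frequency bins.
logits = AttentiveBiRNNClassifier()(torch.randn(4, 256, 128))
print(logits.shape)  # torch.Size([4, 10])

The paper's "parallelized" variant computes attention alongside (rather than after) the recurrent encoding; this sketch shows only the serial-style arrangement for brevity.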
