Article

Exploring Deep Learning Features for Automatic Classification of Human Emotion Using EEG Rhythms

Journal

IEEE SENSORS JOURNAL
Volume 21, Issue 13, Pages 14923-14930

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/JSEN.2021.3070373

Keywords

Electroencephalography; Feature extraction; Support vector machines; Brain modeling; Continuous wavelet transforms; Filtering; EEG-based emotion classification; EEG rhythms; CWT; deep features; pretrained CNN models

Funding

  1. Deanship of Scientific Research (DSR) at King Abdulaziz University, Jeddah [498-135-1442]

Abstract

This study aims to develop an efficient deep-feature-extraction-based method for automatically classifying people's emotional status. The experimental results demonstrate that AlexNet features combined with the Alpha rhythm perform best for valence discrimination, while MobileNetV2 features yield the highest accuracy for arousal discrimination.
Emotion recognition (ER) from electroencephalogram (EEG) signals is a challenging task due to the non-linear and non-stationary nature of EEG signals. Existing feature extraction methods cannot extract the deep, concealed characteristics of EEG signals from different layers for an efficient classification scheme, and it is also hard to select an appropriate and effective feature extraction method for different types of EEG data. Hence, this study develops an efficient deep-feature-extraction-based method to automatically classify people's emotional status. To discover reliable deep features, five deep convolutional neural network (CNN) models are considered: AlexNet, VGG16, ResNet50, SqueezeNet, and MobileNetV2. Pre-processing, the Wavelet Transform (WT), and the Continuous Wavelet Transform (CWT) are employed to convert the EEG signals into EEG rhythm images; the five well-known pretrained CNN models are then used for feature extraction. Finally, the proposed method feeds the obtained features to a support vector machine (SVM), which classifies them along two binary emotion dimensions: valence and arousal. The DEAP dataset was used in the experiments. The results demonstrate that the AlexNet features with the Alpha rhythm produce the best accuracy for valence discrimination (91.07% on channel Oz), while the MobileNetV2 features yield the highest accuracy for arousal discrimination (98.93% with the Delta rhythm on channel C3).
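For readers who want a concrete picture of the pipeline, the short Python sketch below mirrors the three stages the abstract describes: isolating an EEG rhythm with a wavelet decomposition, converting the rhythm segment into a CWT scalogram image, and feeding pretrained-CNN features to an SVM. It is a minimal sketch under stated assumptions, not the authors' implementation: the db4 and Morlet wavelets, the 128 Hz sampling rate and sub-band-to-rhythm mapping, the 224x224 input size, the choice of AlexNet's penultimate layer, and the dummy segments and labels are all illustrative, with PyWavelets, PyTorch/torchvision, and scikit-learn standing in for whatever tooling the paper actually used.

import numpy as np
import pywt
import torch
from torchvision import models, transforms
from sklearn.svm import SVC

FS = 128  # DEAP EEG is commonly used at 128 Hz; an assumption here

def extract_rhythm(signal, keep_index, wavelet="db4", level=5):
    # Isolate one EEG rhythm by zeroing every other DWT sub-band.
    # At 128 Hz with 5 levels, index 3 (cD3, roughly 8-16 Hz) loosely
    # covers the Alpha band; this mapping is approximate.
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    coeffs = [c if i == keep_index else np.zeros_like(c)
              for i, c in enumerate(coeffs)]
    return pywt.waverec(coeffs, wavelet)

def eeg_to_scalogram(signal, scales=np.arange(1, 65), wavelet="morl"):
    # CWT of a 1-D segment -> |coefficients| as a 2-D image, min-max
    # normalized and replicated to 3 channels for the CNN input.
    coeffs, _ = pywt.cwt(signal, scales, wavelet, sampling_period=1.0 / FS)
    img = np.abs(coeffs)
    img = (img - img.min()) / (img.max() - img.min() + 1e-12)
    return np.stack([img] * 3, axis=0)

# Pretrained AlexNet as a fixed feature extractor: dropping its last
# fully connected layer leaves a 4096-D descriptor per image.
alexnet = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
alexnet.classifier = alexnet.classifier[:-1]
alexnet.eval()
resize = transforms.Resize((224, 224), antialias=True)

def deep_features(signal):
    x = torch.tensor(eeg_to_scalogram(signal), dtype=torch.float32)
    x = resize(x).unsqueeze(0)                # (1, 3, 224, 224)
    with torch.no_grad():
        return alexnet(x).squeeze(0).numpy()  # 4096-D feature vector

# Dummy stand-ins for labeled DEAP segments (valence: 0 = low, 1 = high).
segments = [np.random.randn(4 * FS) for _ in range(8)]
labels = np.array([0, 1] * 4)
X = np.stack([deep_features(extract_rhythm(s, keep_index=3))
              for s in segments])
svm = SVC(kernel="rbf").fit(X, labels)
print(svm.predict(X[:2]))

Swapping AlexNet for any other torchvision backbone (for example mobilenet_v2 with its classifier head removed) only changes the feature dimensionality; the SVM stage is unchanged.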
