4.3 Article

Sensorimotor simulation and emotion processing: Impairing facial action increases semantic retrieval demands

Journal

Cognitive, Affective, & Behavioral Neuroscience
Volume 17, Issue 3, Pages 652-664

Publisher

Springer
DOI: 10.3758/s13415-017-0503-2

Keywords

Emotion; ERP; Embodied cognition


Sensorimotor models suggest that understanding the emotional content of a face recruits a simulation process in which a viewer partially reproduces the facial expression in their own sensorimotor system. An important prediction of these models is that disrupting simulation should make emotion recognition more difficult. Here we used electroencephalogram (EEG) and facial electromyogram (EMG) recordings to investigate how interfering with sensorimotor signals from the face influences the real-time processing of emotional faces. EEG and EMG were recorded as healthy adults viewed emotional faces and rated their valence. During control blocks, participants held a conjoined pair of chopsticks loosely between their lips. During interference blocks, participants held the chopsticks horizontally between their teeth and lips to generate motor noise on the lower part of the face; this noise was confirmed by EMG at the zygomaticus. Analysis of the EEG indicated that faces expressing happiness or disgust (lower-face expressions) elicited a larger-amplitude N400 when presented during interference blocks than during control blocks, suggesting that interference led to greater semantic retrieval demands. The selective impact of facial motor interference on the brain response to lower-face expressions supports sensorimotor models of emotion understanding.
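The N400 comparison described in the abstract amounts to comparing mean ERP amplitude in a post-stimulus time window across the interference and control conditions. The sketch below illustrates that general kind of analysis, assuming hypothetical per-participant ERP arrays (`erp_control`, `erp_interference`), a 300-500 ms window, and a paired t-test; all names, parameters, and data here are assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' pipeline): compare mean ERP amplitude in an
# assumed N400 window (300-500 ms post-stimulus) between interference and control
# blocks. `erp_control` and `erp_interference` are hypothetical arrays of shape
# (n_participants, n_timepoints): baseline-corrected per-participant averages at a
# single electrode, sampled at `sfreq` Hz, with the epoch starting at `tmin` s
# relative to face onset.
import numpy as np
from scipy import stats

sfreq = 250.0          # sampling rate in Hz (assumed)
tmin = -0.2            # epoch start relative to stimulus onset, in s (assumed)
win = (0.300, 0.500)   # N400 analysis window in s (typical choice, assumed)

def mean_amplitude(erp, sfreq, tmin, window):
    """Average amplitude of each participant's ERP within a time window."""
    start = int(round((window[0] - tmin) * sfreq))
    stop = int(round((window[1] - tmin) * sfreq))
    return erp[:, start:stop].mean(axis=1)

# Placeholder data; in practice these come from epoched, averaged EEG.
rng = np.random.default_rng(0)
erp_control = rng.normal(size=(24, int(1.0 * sfreq)))
erp_interference = rng.normal(size=(24, int(1.0 * sfreq)))

amp_control = mean_amplitude(erp_control, sfreq, tmin, win)
amp_interference = mean_amplitude(erp_interference, sfreq, tmin, win)

# Paired comparison: a more negative N400 mean amplitude under interference would
# be consistent with greater semantic retrieval demands.
t, p = stats.ttest_rel(amp_interference, amp_control)
print(f"t({len(amp_control) - 1}) = {t:.2f}, p = {p:.3f}")
```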
