Article

Facial Emotions Are Accurately Encoded in the Neural Signal of Those With Autism Spectrum Disorder: A Deep Learning Approach

Publisher

ELSEVIER
DOI: 10.1016/j.bpsc.2021.03.015

Keywords

-

Funding

  1. National Institute of Mental Health [R01MH110585]
  2. Alan Alda Fund for Communication
  3. American Psychological Association
  4. Association for Psychological Science
  5. American Psychological Foundation
  6. Jefferson Scholars Foundation
  7. International Max Planck Research
  8. National Institute Of Mental Health of the National Institutes of Health [F31MH122091]
  9. Temple University Public Policy Lab Graduate Fellowship
  10. American Psychological Association (APA) Dissertation Research Award
  11. National Science Foundation
  12. Dr. Phillip J Bersh Memorial Student Award
  13. [1531492]

Summary

This study utilized deep convolutional neural networks to investigate whether facial emotion information is encoded in the neural signal of individuals with autism spectrum disorder (ASD). The results revealed that facial emotion information is indeed encoded in the neural signal of individuals with ASD. Therefore, the difficulties observed in facial emotion recognition tasks for individuals with ASD may arise from difficulties in decoding or deployment of facial emotion information within the neural signal. This research has important implications for guiding interventions.
BACKGROUND: Individuals with autism spectrum disorder (ASD) exhibit frequent behavioral deficits in facial emotion recognition (FER). It remains unknown whether these deficits arise because facial emotion information is not encoded in their neural signal or because it is encoded but fails to translate to FER behavior (deployment). This distinction has functional implications, including constraining when differences in social information processing occur in ASD, and guiding interventions (i.e., developing prosthetic FER vs. reinforcing existing skills).

METHODS: We utilized a discriminative and contemporary machine learning approach, deep convolutional neural networks, to classify facial emotions viewed by individuals with and without ASD (N = 88) from concurrently recorded electroencephalography signals.

RESULTS: The convolutional neural network classified facial emotions with high accuracy for both ASD and non-ASD groups, even though individuals with ASD performed more poorly on the concurrent FER task. In fact, convolutional neural network accuracy was greater in the ASD group and was not related to behavioral performance. This pattern of results replicated across three independent participant samples. Moreover, feature importance analyses suggested that a late temporal window of neural activity (1000-1500 ms) may be uniquely important in facial emotion classification for individuals with ASD.

CONCLUSIONS: Our results reveal for the first time that facial emotion information is encoded in the neural signal of individuals with (and without) ASD. Thus, observed difficulties in behavioral FER associated with ASD likely arise from difficulties in decoding or deployment of facial emotion information within the neural signal. Interventions should focus on capitalizing on this intact encoding rather than promoting compensation or FER prostheses.
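The core analysis described above, classifying the emotion of a viewed face from a multichannel EEG epoch with a convolutional network, can be sketched in miniature. The snippet below is an illustrative numpy-only forward pass, not the paper's actual architecture: the channel count, sampling assumptions, filter sizes, and random weights are all hypothetical, and a real analysis would train the weights (e.g., with a deep learning framework) on labeled epochs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 64 EEG channels, a 1500 ms epoch sampled at
# 500 Hz (750 time points), and 4 candidate emotion classes.
n_channels, n_times, n_emotions, n_filters, k = 64, 750, 4, 8, 25

def conv1d_temporal(x, kernels):
    """Valid-mode temporal convolution across all channels at once.
    x: (channels, times); kernels: (filters, channels, k).
    Returns (filters, times - k + 1)."""
    out = np.empty((kernels.shape[0], x.shape[1] - k + 1))
    for f in range(kernels.shape[0]):
        for t in range(out.shape[1]):
            out[f, t] = np.sum(kernels[f] * x[:, t:t + k])
    return out

def forward(x, kernels, w_out):
    """One conv layer -> ReLU -> global average pool over time -> softmax."""
    h = np.maximum(conv1d_temporal(x, kernels), 0.0)  # ReLU activation
    pooled = h.mean(axis=1)                           # pool over the time axis
    logits = pooled @ w_out                           # linear readout to classes
    e = np.exp(logits - logits.max())                 # numerically stable softmax
    return e / e.sum()

# Randomly initialized (untrained) weights and one simulated EEG epoch.
kernels = rng.standard_normal((n_filters, n_channels, k)) * 0.01
w_out = rng.standard_normal((n_filters, n_emotions)) * 0.1
epoch = rng.standard_normal((n_channels, n_times))

probs = forward(epoch, kernels, w_out)  # probability per emotion class
```

The feature importance result in the abstract (a late 1000-1500 ms window mattering for the ASD group) corresponds, in a setup like this, to asking which temporal slices of `epoch` most influence `probs`, for example by occluding windows of the signal and measuring the drop in classification accuracy.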

