Article

Neural organization for recognition of grammatical and emotional facial expressions in deaf ASL signers and hearing nonsigners

Journal

COGNITIVE BRAIN RESEARCH
Volume 22, Issue 2, Pages 193-203

Publisher

ELSEVIER
DOI: 10.1016/j.cogbrainres.2004.08.012

Keywords

fMRI; facial expression; language

Funding

  1. NICHD NIH HHS [HD13249] Funding Source: Medline

Abstract

Recognition of emotional facial expressions is universal for all humans, but signed language users must also recognize certain non-affective facial expressions as linguistic markers. fMRI was used to investigate the neural systems underlying recognition of these functionally distinct expressions, comparing deaf ASL signers and hearing nonsigners. Within the superior temporal sulcus (STS), activation for emotional expressions was right lateralized for the hearing group and bilateral for the deaf group. In contrast, activation within STS for linguistic facial expressions was left lateralized only for signers, and only when linguistic facial expressions co-occurred with verbs. Within the fusiform gyrus (FG), activation was left lateralized for ASL signers for both expression types, whereas activation was bilateral for both expression types for nonsigners. We propose that left lateralization in FG may be due to continuous analysis of local facial features during online sign language processing. The results indicate that function in part drives the lateralization of neural systems that process human facial expressions. (C) 2004 Elsevier B.V. All rights reserved.
