Article

Evidence for children's online integration of simultaneous information from speech and iconic gestures: an ERP study

Journal

Language, Cognition and Neuroscience
Volume 35, Issue 10, Pages 1283-1294

Publisher

Routledge, Taylor & Francis Ltd
DOI: 10.1080/23273798.2020.1737719

Keywords

Multimodal integration; co-speech gestures; children; ERPs; N400

Funding

  1. European Commission, Marie Curie Actions [H2020-MSCA-IF-2015]


Children perceive iconic gestures along with the speech they hear. Previous studies have shown that children integrate information from both modalities. Yet it is not known whether children integrate both types of information simultaneously, as soon as they become available (as adults do), or whether they initially process them separately and integrate them later. Using electrophysiological measures, we examined the online neurocognitive processing of gesture-speech integration in 6- to 7-year-old children. We focused on the N400 event-related potential component, which is modulated by semantic integration load. Children watched video clips of matching or mismatching gesture-speech combinations, which varied the semantic integration load. The ERPs showed that the amplitude of the N400 was larger in the mismatching condition than in the matching condition. This finding provides the first neural evidence that, by the age of 6 or 7, children integrate multimodal semantic information in an online fashion comparable to that of adults.
