Article

Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions

Journal

NEUROPSYCHOLOGIA
Volume 57, Pages 71-77

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neuropsychologia.2014.02.004

Keywords

Audio-visual speech perception; Audio-haptic speech perception; Multisensory interactions; EEG

Funding

  1. Centre National de la Recherche Scientifique (CNRS)

Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions. (C) 2014 Published by Elsevier Ltd.

