4.4 Article

Resolving the time course of visual and auditory object categorization

Journal

JOURNAL OF NEUROPHYSIOLOGY
Volume 127, Issue 6, Pages 1622-1628

Publisher

AMER PHYSIOLOGICAL SOC
DOI: 10.1152/jn.00515.2021

Keywords

auditory modality; EEG; MVPA; object categorization; visual modality

Funding

  1. Berlin School of Mind and Brain PhD scholarship
  2. Einstein Center for Neurosciences Berlin
  3. Deutsche Forschungsgemeinschaft (DFG) [KA4683/2-1, CI241/1-1, CI241/3-1, CI241/7-1]
  4. European Research Council (ERC)


Abstract

This study investigates how object category information is extracted in the auditory modality, using EEG and time-resolved multivariate pattern analysis (MVPA). The results show that auditory object category representations can be reliably extracted from EEG signals and that a similar representational transition occurs in the auditory and visual modalities.

Humans can effortlessly categorize objects, whether they are conveyed through visual images or spoken words. To resolve the neural correlates of object categorization, studies have so far primarily focused on the visual modality. It therefore remains unclear how the brain extracts categorical information from auditory signals. In the current study, we used EEG (n = 48) and time-resolved multivariate pattern analysis to investigate 1) the time course with which object category information emerges in the auditory modality and 2) how the representational transition from individual object identification to category representation compares between the auditory and visual modalities. Our results show that 1) auditory object category representations can be reliably extracted from EEG signals and 2) a similar representational transition occurs in the visual and auditory modalities, in which an initial representation at the individual-object level is followed by a subsequent representation of the objects' category membership. Altogether, our results suggest an analogous hierarchy of information processing across sensory channels. However, there was no convergence toward conceptual modality-independent representations, thus providing no evidence for a shared supramodal code.

NEW & NOTEWORTHY: Object categorization operates on inputs from different sensory modalities, such as vision and audition. This process has mainly been studied in vision. Here, we explore auditory object categorization. We show that auditory object category representations can be reliably extracted from EEG signals and that, as in vision, auditory representations initially carry information about individual objects, followed by a subsequent representation of the objects' category membership.
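Time-resolved MVPA of the kind described above is typically implemented by training a classifier on the multichannel EEG pattern at each time point and assessing cross-validated decoding accuracy over time. The sketch below is a minimal illustration of that general approach, not the authors' exact pipeline; the data shape, classifier choice (linear SVM), and cross-validation scheme are assumptions for the example.

```python
# Minimal sketch of time-resolved MVPA decoding on epoched EEG data.
# Assumptions (not taken from the paper): data of shape (n_trials, n_channels, n_times),
# one category label per trial, a linear SVM, and stratified k-fold cross-validation.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score, StratifiedKFold

def time_resolved_decoding(epochs, labels, n_folds=5):
    """Decode category membership from the EEG channel pattern at each time point.

    epochs : ndarray, shape (n_trials, n_channels, n_times)
    labels : ndarray, shape (n_trials,)
    Returns an array of cross-validated decoding accuracies, one per time point.
    """
    n_trials, n_channels, n_times = epochs.shape
    cv = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    accuracies = np.zeros(n_times)
    for t in range(n_times):
        X = epochs[:, :, t]  # channel pattern at time point t
        accuracies[t] = cross_val_score(clf, X, labels, cv=cv).mean()
    return accuracies

# Usage example with simulated data (200 trials, 64 channels, 100 time points);
# with random data, accuracy should hover around chance (0.5).
# rng = np.random.default_rng(0)
# epochs = rng.standard_normal((200, 64, 100))
# labels = rng.integers(0, 2, size=200)
# acc = time_resolved_decoding(epochs, labels)
```

In a real analysis, above-chance accuracy at a given latency would indicate that category information is present in the EEG signal at that time point; the onset and peak of the accuracy time course are what allow comparisons such as the visual-versus-auditory contrast reported in the abstract.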
