Article

Audiovisual task switching rapidly modulates sound encoding in mouse auditory cortex

Journal

ELIFE
Volume 11, Issue -, Pages -

Publisher

eLIFE SCIENCES PUBL LTD
DOI: 10.7554/eLife.75839

Keywords

attention; cross-modal; multisensory; cortical layers; behavior; extracellular physiology; mouse

Funding

  1. National Institutes of Health [R01NS116598, R01DC014101, F32DC016846]
  2. National Science Foundation
  3. Hearing Research Incorporated
  4. Klingenstein Foundation
  5. Coleman Memorial Fund

Abstract

In everyday behavior, sensory systems are in constant competition for attentional resources, but the cellular and circuit-level mechanisms of modality-selective attention remain largely uninvestigated. We conducted translaminar recordings in mouse auditory cortex (AC) during an audiovisual (AV) attention shifting task. Attending to sound elements in an AV stream reduced both pre-stimulus and stimulus-evoked spiking activity, primarily in deep-layer neurons and neurons without spectrotemporal tuning. Despite reduced spiking, stimulus decoder accuracy was preserved, suggesting improved sound encoding efficiency. Similarly, task-irrelevant mapping stimuli during inter-trial intervals evoked fewer spikes without impairing stimulus encoding, indicating that attentional modulation generalized beyond training stimuli. Importantly, spiking reductions predicted trial-to-trial behavioral accuracy during auditory attention, but not visual attention. Together, these findings suggest auditory attention facilitates sound discrimination by filtering sound-irrelevant background activity in AC, and that the deepest cortical layers serve as a hub for integrating extramodal contextual information.
