Article

Acoustic facilitation of object movement detection during self-motion

Journal

Proceedings of the Royal Society B: Biological Sciences

Publisher

ROYAL SOC
DOI: 10.1098/rspb.2010.2757

Keywords

flow parsing; visual search; multisensory perception; visual motion; auditory motion

Funding

  1. NIH [RO1NS064100]
  2. Ministerio de Ciencia e Innovacion [PSI2010-15426, CSD2007-00012]
  3. Comissionat per a Universitats i Recerca del DIUE [SRG2009-092]
  4. European Research Council [StG-2010 263145]
  5. Sound Field Laboratory at Sargent College, Boston University [P30 DC04663]
  6. ICREA Funding Source: Custom

Abstract

In humans, as well as most animal species, perception of object motion is critical to successful interaction with the surrounding environment. Yet, as the observer also moves, the retinal projections of the various motion components add to each other and extracting accurate object motion becomes computationally challenging. Recent psychophysical studies have demonstrated that observers use a flow-parsing mechanism to estimate and subtract self-motion from the optic flow field. We investigated whether concurrent acoustic cues for motion can facilitate visual flow parsing, thereby enhancing the detection of moving objects during simulated self-motion. Participants identified an object (the target) that moved either forward or backward within a visual scene containing nine identical textured objects simulating forward observer translation. We found that spatially co-localized, directionally congruent, moving auditory stimuli enhanced object motion detection. Interestingly, subjects who performed poorly on the visual-only task benefited more from the addition of moving auditory stimuli. When auditory stimuli were not co-localized to the visual target, improvements in detection rates were weak. Taken together, these results suggest that parsing object motion from self-motion-induced optic flow can operate on multisensory object representations.

