Article

Psychophysical scaling reveals a unified theory of visual memory strength

Journal

NATURE HUMAN BEHAVIOUR
Volume 4, Issue 11, Pages 1156+

Publisher

NATURE RESEARCH
DOI: 10.1038/s41562-020-00938-0


Funding

  1. NSF CAREER [BCS-1653457]

Abstract

Almost all models of visual memory implicitly assume that errors in mnemonic representations are linearly related to distance in stimulus space. Here we show that neither memory nor perception is appropriately scaled in stimulus space; instead, both are based on a transformed similarity representation that is nonlinearly related to stimulus space. This result calls into question a foundational assumption of extant models of visual working memory. Once psychophysical similarity is taken into account, aspects of memory that have been thought to demonstrate a fixed working memory capacity of around three or four items, and to require fundamentally different representations across different stimuli, tasks and types of memory, can be parsimoniously explained with a unitary signal detection framework. These results have substantial implications for the study of visual memory and lead to a substantial reinterpretation of the relationship between perception, working memory and long-term memory.

Schurgin et al. propose a model of visual memory, arguing against a distinction between how many items are represented and how precisely they are represented, and in favour of a view based on continuous representations in noisy channels.
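
To make the abstract's central claim concrete, below is a minimal sketch (in Python) of a signal detection account of continuous report built on a nonlinear psychophysical similarity function. The exponential falloff, the parameter values and the function names are illustrative assumptions chosen for exposition, not the authors' published implementation.

import numpy as np

def psychophysical_similarity(distance_deg, falloff=20.0):
    # Nonlinear mapping from distance in stimulus space (e.g. degrees on a
    # colour wheel) to perceived similarity: nearby values look very similar,
    # and similarity falls off quickly before flattening out.
    return np.exp(-np.abs(distance_deg) / falloff)

def simulate_report(target_deg, d_prime, n_trials=10000, seed=None):
    # Signal detection sketch of one continuous-report condition: every
    # candidate response generates a noisy familiarity signal whose mean is
    # d' scaled by its similarity to the studied target; the observer reports
    # the candidate with the maximum signal on each trial.
    rng = np.random.default_rng(seed)
    candidates = np.arange(-180, 180)                 # responses on a 360-degree wheel
    distances = (candidates - target_deg + 180) % 360 - 180
    mean_signal = d_prime * psychophysical_similarity(distances)
    noisy = mean_signal + rng.standard_normal((n_trials, candidates.size))
    return candidates[np.argmax(noisy, axis=1)]       # reported value per trial

errors = simulate_report(target_deg=0, d_prime=2.0, seed=1)
print("mean absolute error (deg):", np.abs(errors).mean())

In this sketch a single memory-strength parameter (d') controls the whole error distribution: lowering d' produces both more small errors and more far-from-target reports, without any separate capacity or precision parameters, which is the kind of unitary account the abstract describes.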
