Article

Vision and touch are automatically integrated for the perception of sequences of events

Journal

JOURNAL OF VISION
Volume 6, Issue 5, Pages 554-564

Publisher

Association for Research in Vision and Ophthalmology (ARVO)
DOI: 10.1167/6.5.2

Keywords

tactile; visual; multimodal interaction; illusions; sensory systems; Bayesian integration


The purpose of the present experiment was to investigate the integration of sequences of visual and tactile events. Subjects were presented with sequences of visual flashes and tactile taps simultaneously and instructed to count either the flashes (Session 1) or the taps (Session 2). The number of flashes could differ from the number of taps by +/-1. For both sessions, the perceived number of events was significantly influenced by the number of events presented in the task-irrelevant modality. Touch had a stronger influence on vision than vision on touch. Interestingly, touch was the more reliable of the two modalities, yielding less variable estimates when presented alone. For both sessions, the perceptual estimates were less variable when stimuli were presented in both modalities than when the task-relevant modality was presented alone. These results indicate that even when one signal is explicitly task irrelevant, sensory information tends to be automatically integrated across modalities. They also suggest that the relative weight of each sensory channel in the integration process depends on its relative reliability. The results are described using a Bayesian probabilistic model for multimodal integration that accounts for the coupling between the sensory estimates.
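The reliability-weighted integration described in the abstract is often illustrated with the linear maximum-likelihood special case of Bayesian cue combination, in which each cue is weighted by its inverse variance. The sketch below uses hypothetical variance values (not the paper's fitted parameters) to show the two qualitative predictions the abstract reports: the more reliable cue (touch) dominates the fused estimate, and the fused estimate is less variable than either unimodal estimate.

```python
# Minimal sketch of inverse-variance-weighted (maximum-likelihood) cue
# combination, a standard linear special case of Bayesian multimodal
# integration. All numeric values are illustrative assumptions.

def combine(est_a, var_a, est_b, var_b):
    """Fuse two noisy estimates, weighting each by its reliability (1/variance)."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    w_b = 1.0 - w_a
    fused_est = w_a * est_a + w_b * est_b
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_b)  # always below min(var_a, var_b)
    return fused_est, fused_var

# Hypothetical unimodal counts and variances: touch is the more
# reliable channel (smaller variance), as reported in the abstract.
touch_est, touch_var = 3.0, 0.4
vision_est, vision_var = 4.0, 0.9

est, var = combine(touch_est, touch_var, vision_est, vision_var)
print(est, var)  # fused estimate lies closer to touch; variance below 0.4
```

Note that this linear rule assumes full fusion of the two signals; the paper's model additionally accounts for partial coupling between the sensory estimates, which this sketch does not capture.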

