4.4 Article

Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus

Journal

Journal of Neurophysiology
Volume 94, Issue 4, Pages 2331-2352

Publisher

American Physiological Society
DOI: 10.1152/jn.00021.2005

Funding

  1. NINDS NIH HHS [NS-50942, NS-17778] Funding Source: Medline

Abstract

The integration of visual and auditory events is thought to require a joint representation of visual and auditory space in a common reference frame. We investigated the coding of visual and auditory space in the lateral and medial intraparietal areas (LIP, MIP) as a candidate for such a representation. We recorded the activity of 275 neurons in LIP and MIP of two monkeys while they performed saccades to a row of visual and auditory targets from three different eye positions. We found 45% of these neurons to be modulated by the locations of visual targets, 19% by auditory targets, and 9% by both visual and auditory targets. The reference frame for both visual and auditory receptive fields ranged along a continuum between eye- and head-centered reference frames: ~10% of auditory and 33% of visual neurons had receptive fields more consistent with an eye- than a head-centered frame of reference, and 23% and 18%, respectively, had receptive fields more consistent with a head- than an eye-centered frame of reference, leaving a large fraction of both visual and auditory response patterns inconsistent with either head- or eye-centered reference frames. The results were similar to the reference frame we have previously found for auditory stimuli in the inferior colliculus and core auditory cortex. The correspondence between the visual and auditory receptive fields of individual neurons was weak. Nevertheless, the visual and auditory responses were sufficiently well correlated that a simple one-layer network constructed to calculate target location from the activity of the neurons in our sample performed successfully for auditory targets even though its weights were fit based only on the visual responses. We interpret these results as suggesting that although the representations of space in areas LIP and MIP are not easily described within the conventional conceptual framework of reference frames, they nevertheless process visual and auditory spatial information in a similar fashion.
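
To make the readout result concrete, here is a minimal sketch of the kind of one-layer linear readout the abstract describes, written in Python with NumPy. Everything in it is an assumption for illustration: the data are synthetic (random location tuning plus noise standing in for the correlated visual and auditory responses), and the trial count, target range, noise level, and variable names are placeholders; only the population size of 275 neurons comes from the record above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder dimensions: 275 neurons (from the paper) over a hypothetical
# number of trials; target locations span an assumed range in degrees.
n_neurons, n_trials = 275, 500
targets = rng.uniform(-24, 24, size=n_trials)

# Synthetic stand-ins for trial-by-trial firing rates. The auditory
# responses are built as a noisy copy of the visual ones, mimicking the
# reported correlation between the two modalities across the population.
tuning = rng.normal(size=n_neurons)          # each neuron's location sensitivity
r_visual = np.outer(targets, tuning) + rng.normal(size=(n_trials, n_neurons))
r_auditory = r_visual + rng.normal(scale=1.0, size=(n_trials, n_neurons))

# Fit a one-layer linear readout of target location from the VISUAL
# responses only (ordinary least squares, with a bias column).
X_vis = np.column_stack([r_visual, np.ones(n_trials)])
weights, *_ = np.linalg.lstsq(X_vis, targets, rcond=None)

# Apply the SAME visually fit weights to the auditory responses.
X_aud = np.column_stack([r_auditory, np.ones(n_trials)])
decoded = X_aud @ weights

r = np.corrcoef(decoded, targets)[0, 1]
print(f"decoded vs. true auditory target location: r = {r:.2f}")
```

Ordinary least squares here stands in for whatever fitting procedure the authors actually used; the point of the sketch is only the transfer step, in which weights estimated from one modality are reused, unchanged, to decode the other.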

Authors

O'Dhaniel A. Mullette-Gillman, Yale E. Cohen, Jennifer M. Groh
