Article

Flies and humans share a motion estimation strategy that exploits natural scene statistics

Journal

NATURE NEUROSCIENCE
Volume 17, Issue 2, Pages 296-303

Publisher

NATURE PUBLISHING GROUP
DOI: 10.1038/nn.3600

Funding

  1. US National Institutes of Health T32 Vision Research Training
  2. Jane Coffin Childs Foundation
  3. National Science Foundation Graduate Research Fellowship
  4. US National Institutes of Health [EY015790]
  5. US National Institutes of Health
  6. US National Institutes of Health Director's Pioneer Award [DP1 OD003530]
  7. National Science Foundation [NSF-0801700]
  8. US National Institutes of Health [R01 EY022638]

Abstract

Sighted animals extract motion information from visual scenes by processing spatiotemporal patterns of light falling on the retina. The dominant models for motion estimation exploit intensity correlations only between pairs of points in space and time. Moving natural scenes, however, contain more complex correlations. We found that fly and human visual systems encode the combined direction and contrast polarity of moving edges using triple correlations that enhance motion estimation in natural environments. Both species extracted triple correlations with neural substrates tuned for light or dark edges, and sensitivity to specific triple correlations was retained even as light and dark edge motion signals were combined. Thus, both species separately process light and dark image contrasts to capture motion signatures that can improve estimation accuracy. This convergence argues that statistical structures in natural scenes have greatly affected visual processing, driving a common computational strategy over 500 million years of evolution.
