Article

Dynamics of visual information integration in the brain for categorizing facial expressions

Journal

CURRENT BIOLOGY
Volume 17, Issue 18, Pages 1580-1585

Publisher

CELL PRESS
DOI: 10.1016/j.cub.2007.08.048


A key to understanding visual cognition is to determine when, how, and with what information the human brain distinguishes between visual categories. So far, the dynamics of information processing for the categorization of visual stimuli have not been elucidated. By using an ecologically important categorization task (seven expressions of emotion), we demonstrate, in three human observers, that an early brain event (the N170 Event Related Potential, occurring 170 ms after stimulus onset [1-16]) integrates visual information specific to each expression, according to a pattern. Specifically, starting 50 ms prior to the ERP peak, facial information tends to be integrated from the eyes downward in the face. This integration stops, and the ERP peaks, when the information diagnostic for judging a particular expression has been integrated (e.g., the eyes in fear, the corners of the nose in disgust, or the mouth in happiness). Consequently, the duration of information integration from the eyes down determines the latency of the N170 for each expression (e.g., with fear being faster than disgust, itself faster than happy). For the first time in visual categorization, we relate the dynamics of an important brain event to the dynamics of a precise information-processing function.

