Dailey, Matthew N. and Cottrell, Garrison W. and Padgett, Curtis and Adolphs, Ralph (2002) EMPATH: A Neural Network that Categorizes Facial Expressions. Journal of Cognitive Neuroscience, 14 (8). pp. 1158-1173. ISSN 0898-929X. http://resolver.caltech.edu/CaltechAUTHORS:DAIjcn02
There are two competing theories of facial expression recognition. Some researchers have suggested that it is an example of "categorical perception." In this view, expression categories are considered to be discrete entities with sharp boundaries, and discrimination of nearby pairs of expressive faces is enhanced near those boundaries. Other researchers, however, suggest that facial expression perception is more graded and that facial expressions are best thought of as points in a continuous, low-dimensional space, where, for instance, "surprise" expressions lie between "happiness" and "fear" expressions due to their perceptual similarity. In this article, we show that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories. Without any parameter tuning, the model matches a variety of psychological data on categorization, similarity, reaction times, discrimination, and recognition difficulty, both qualitatively and quantitatively. We thus explain many of the seemingly complex psychological phenomena related to facial expression perception as natural consequences of the task's implementation in the brain.
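As a rough illustration of the final categorization stage such a model implies, the sketch below trains a single-layer network with a softmax output over six emotion categories. This is a minimal, hypothetical stand-in, not the paper's implementation: the feature vectors here are synthetic random data, whereas the actual model derives its inputs from a biologically motivated front end (e.g., filter responses reduced to a low-dimensional representation).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in features: in the real model these would come from a
# perceptual front end; here we use random 50-dimensional vectors.
n_samples, n_features, n_classes = 120, 50, 6  # six basic emotion categories
X = rng.normal(size=(n_samples, n_features))
true_W = rng.normal(size=(n_features, n_classes))
y = np.argmax(X @ true_W, axis=1)  # synthetic emotion labels

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Single-layer network trained by gradient descent on cross-entropy loss.
W = np.zeros((n_features, n_classes))
onehot = np.eye(n_classes)[y]
for _ in range(200):
    P = softmax(X @ W)
    W -= 0.1 * (X.T @ (P - onehot)) / n_samples

P = softmax(X @ W)
accuracy = (P.argmax(axis=1) == y).mean()
```

The graded softmax outputs are what make both readings of the data possible: taking the argmax yields sharp category boundaries, while the full probability vector places each face in a continuous similarity space.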
Additional Information: © 2002 The MIT Press. We thank Stevan Harnad, Bill Kristan, Paul Munro, Alice O'Toole, Terry Sejnowski, and Gary's Unbelievable Research Unit (GURU) for helpful comments on previous versions of this manuscript. We also thank Andrew Young for permission to use the human data plotted in Figures 2, 3, 7a, and 8. We are also grateful to Paul Ekman for providing us with the Pictures of Facial Affect. This research was funded by NIMH grant MH57075 to GWC.
Usage Policy: No commercial reproduction, distribution, display, or performance rights in this work are provided.
Deposited By: Archive Administrator
Deposited On: 04 Jan 2007
Last Modified: 08 Feb 2017 20:26