CaltechAUTHORS
  A Caltech Library Service

EMPATH: A Neural Network that Categorizes Facial Expressions

Dailey, Matthew N. and Cottrell, Garrison W. and Padgett, Curtis and Adolphs, Ralph (2002) EMPATH: A Neural Network that Categorizes Facial Expressions. Journal of Cognitive Neuroscience, 14 (8). pp. 1158-1173. ISSN 0898-929X. http://resolver.caltech.edu/CaltechAUTHORS:DAIjcn02



Abstract

There are two competing theories of facial expression recognition. Some researchers have suggested that it is an example of "categorical perception." In this view, expression categories are considered to be discrete entities with sharp boundaries, and discrimination of nearby pairs of expressive faces is enhanced near those boundaries. Other researchers, however, suggest that facial expression perception is more graded and that facial expressions are best thought of as points in a continuous, low-dimensional space, where, for instance, "surprise" expressions lie between "happiness" and "fear" expressions due to their perceptual similarity. In this article, we show that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories. Without any parameter tuning, the model matches a variety of psychological data on categorization, similarity, reaction times, discrimination, and recognition difficulty, both qualitatively and quantitatively. We thus explain many of the seemingly complex psychological phenomena related to facial expression perception as natural consequences of the tasks' implementations in the brain.
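To make the modeling approach concrete, here is a minimal sketch, not the authors' code, of the kind of classifier the abstract describes: a network trained to map face-image feature vectors onto six basic-emotion categories, whose graded softmax outputs can support both category-like and continuous-similarity behavior. The feature dimensionality, training details, and class names are illustrative assumptions.

```python
import numpy as np

# The six basic emotions used as illustrative category labels.
EMOTIONS = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"]

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class EmotionClassifier:
    """Softmax (multinomial logistic) classifier over face feature vectors."""

    def __init__(self, n_features, n_classes=len(EMOTIONS), seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.01, (n_features, n_classes))
        self.b = np.zeros(n_classes)

    def predict_proba(self, X):
        # Graded outputs: each face receives a probability over all six
        # emotions rather than a single hard label, so "surprise" faces can
        # sit between "happiness" and "fear" in output space.
        return softmax(X @ self.W + self.b)

    def fit(self, X, y, lr=0.1, epochs=200):
        # Plain batch gradient descent on the cross-entropy loss.
        Y = np.eye(self.b.size)[y]  # one-hot targets
        for _ in range(epochs):
            P = self.predict_proba(X)
            self.W -= lr * (X.T @ (P - Y)) / len(X)
            self.b -= lr * (P - Y).mean(axis=0)
        return self
```

A model of this family produces a full probability distribution per face, so distances between output vectors can be compared against human similarity and discrimination data, which is the style of comparison the abstract reports.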


Item Type: Article
Additional Information: © 2002 The MIT Press. We thank Stevan Harnad, Bill Kristan, Paul Munro, Alice O’Toole, Terry Sejnowski, and Gary’s Unbelievable Research Unit (GURU) for helpful comments on previous versions of this manuscript. We also thank Andrew Young for permission to use the human data plotted in Figures 2, 3, 7a, and 8. We are also grateful to Paul Ekman for providing us with the Pictures of Facial Affect. This research was funded by NIMH grant MH57075 to GWC.
Record Number: CaltechAUTHORS:DAIjcn02
Persistent URL: http://resolver.caltech.edu/CaltechAUTHORS:DAIjcn02
Alternative URL: http://dx.doi.org/10.1162/089892902760807177
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 6983
Collection: CaltechAUTHORS
Deposited By: Archive Administrator
Deposited On: 04 Jan 2007
Last Modified: 26 Dec 2012 09:27
