
EMPATH: A Neural Network that Categorizes Facial Expressions

Dailey, Matthew N. and Cottrell, Garrison W. and Padgett, Curtis and Adolphs, Ralph (2002) EMPATH: A Neural Network that Categorizes Facial Expressions. Journal of Cognitive Neuroscience, 14 (8). pp. 1158-1173. ISSN 0898-929X. doi:10.1162/089892902760807177.



There are two competing theories of facial expression recognition. Some researchers have suggested that it is an example of "categorical perception." In this view, expression categories are considered to be discrete entities with sharp boundaries, and discrimination of nearby pairs of expressive faces is enhanced near those boundaries. Other researchers, however, suggest that facial expression perception is more graded and that facial expressions are best thought of as points in a continuous, low-dimensional space, where, for instance, "surprise" expressions lie between "happiness" and "fear" expressions due to their perceptual similarity. In this article, we show that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories. Without any parameter tuning, the model matches a variety of psychological data on categorization, similarity, reaction times, discrimination, and recognition difficulty, both qualitatively and quantitatively. We thus explain many of the seemingly complex psychological phenomena related to facial expression perception as natural consequences of the tasks' implementations in the brain.
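The abstract's central point, that one classifier can support both the categorical and the graded reading, can be sketched with a toy model. This is a hypothetical illustration, not the EMPATH architecture from the paper: the network size, the random features standing in for Gabor-filtered face images, and the class names are all assumptions made here for demonstration. The softmax output vector is a point in a continuous six-dimensional space (the graded view), while taking its argmax yields a discrete category label (the categorical view).

```python
import numpy as np

# The six basic-emotion categories used in the paper.
EMOTIONS = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"]

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(z - z.max())
    return e / e.sum()

class TinyExpressionNet:
    """A minimal one-hidden-layer classifier (illustrative only,
    not the model described in the article)."""

    def __init__(self, n_features=40, n_hidden=10, n_classes=6):
        # Untrained random weights; a real model would be fit to labeled faces.
        self.W1 = rng.normal(scale=0.1, size=(n_hidden, n_features))
        self.W2 = rng.normal(scale=0.1, size=(n_classes, n_hidden))

    def forward(self, x):
        h = np.tanh(self.W1 @ x)          # hidden representation
        return softmax(self.W2 @ h)       # graded category memberships

net = TinyExpressionNet()
x = rng.normal(size=40)                   # stand-in for face features
p = net.forward(x)                        # continuous point in emotion space

# Discrete reading of the same output: pick the most active category.
label = EMOTIONS[int(np.argmax(p))]
```

The same output vector `p` thus yields both interpretations: its full profile gives graded similarity to each expression category, and its maximum gives a sharp categorical decision.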

Item Type: Article
ORCID: Adolphs, Ralph: 0000-0002-8053-9692
Additional Information: © 2002 The MIT Press. We thank Stevan Harnad, Bill Kristan, Paul Munro, Alice O'Toole, Terry Sejnowski, and Gary's Unbelievable Research Unit (GURU) for helpful comments on previous versions of this manuscript. We also thank Andrew Young for permission to use the human data plotted in Figures 2, 3, 7a, and 8. We are also grateful to Paul Ekman for providing us with the Pictures of Facial Affect. This research was funded by NIMH grant MH57075 to GWC.
Funding Agency: National Institute of Mental Health (NIMH); Grant Number: UNSPECIFIED
Issue or Number: 8
Record Number: CaltechAUTHORS:DAIjcn02
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 6983
Deposited By: Archive Administrator
Deposited On: 04 Jan 2007
Last Modified: 08 Nov 2021 20:38
