Published March 2017 | Accepted Version
Journal Article | Open Access

Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence

  • 1. Chinese University of Hong Kong
  • 2. Max Planck Institute for Human Cognitive and Brain Sciences
  • 3. National University of Singapore
  • 4. California Institute of Technology

Abstract

Historically, research on emotion perception has focused on facial expressions, and findings from this modality have come to dominate our thinking about other modalities. Here we examine emotion perception through a wider lens by comparing facial with vocal and tactile processing. We review stimulus characteristics and ensuing behavioral and brain responses and show that audition and touch do not simply duplicate visual mechanisms. Each modality provides a distinct input channel and engages partly nonoverlapping neuroanatomical systems with different processing specializations (e.g., specific emotions versus affect). Moreover, processing of signals across the different modalities converges, first into multi- and later into amodal representations that enable holistic emotion judgments.

Additional Information

© 2017 Elsevier Ltd. Available online 4 February 2017.

Attached Files

Accepted Version - nihms842467.pdf (1.2 MB)
md5:723318cb4c8fe4ad30565016ba3ebdc5

Additional details

Identifiers

PMCID
PMC5334135
Eprint ID
74224
Resolver ID
CaltechAUTHORS:20170213-075959802

Dates

Created
2017-02-13
Updated
2021-11-11