
Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence

Schirmer, Annett and Adolphs, Ralph (2017) Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence. Trends in Cognitive Sciences, 21 (3). pp. 216-228. ISSN 1364-6613. PMCID PMC5334135. doi:10.1016/j.tics.2017.01.001.

PDF (Accepted Version). See Usage Policy.


Historically, research on emotion perception has focused on facial expressions, and findings from this modality have come to dominate our thinking about other modalities. Here we examine emotion perception through a wider lens by comparing facial with vocal and tactile processing. We review stimulus characteristics and ensuing behavioral and brain responses and show that audition and touch do not simply duplicate visual mechanisms. Each modality provides a distinct input channel and engages partly nonoverlapping neuroanatomical systems with different processing specializations (e.g., specific emotions versus affect). Moreover, processing of signals across the different modalities converges, first into multi- and later into amodal representations that enable holistic emotion judgments.

Item Type: Article
Related URLs: PubMed Central (Article)
ORCID: Adolphs, Ralph: 0000-0002-8053-9692
Additional Information: © 2017 Elsevier Ltd. Available online 4 February 2017.
Issue or Number: 3
PubMed Central ID: PMC5334135
Record Number: CaltechAUTHORS:20170213-075959802
Persistent URL:
Official Citation: Annett Schirmer, Ralph Adolphs, Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence, Trends in Cognitive Sciences, Volume 21, Issue 3, March 2017, Pages 216-228, ISSN 1364-6613.
Usage Policy: No commercial reproduction, distribution, display, or performance rights in this work are provided.
ID Code: 74224
Deposited By: Ruth Sustaita
Deposited On: 13 Feb 2017 17:13
Last Modified: 11 Nov 2021 05:25
