Journal Article | Open Access
Published September 24, 2022

Atypical gaze patterns in autistic adults are heterogeneous across but reliable within individuals


Background: Across behavioral studies, autistic individuals show greater variability than typically developing individuals. However, it remains unknown to what extent this variability arises from heterogeneity across individuals or from unreliability within individuals. Here we focus on eye tracking, which provides rich dependent measures that have been used extensively in studies of autism. Autistic individuals show atypical gaze onto both static visual images and dynamic videos, a pattern that could be leveraged for diagnostic purposes if the above open question were addressed.

Methods: We tested three competing hypotheses: (1) that gaze patterns of autistic individuals are less reliable or noisier than those of controls; (2) that atypical gaze patterns are individually reliable but heterogeneous across autistic individuals; or (3) that atypical gaze patterns are individually reliable and also homogeneous among autistic individuals. We collected desktop-based eye tracking data while participants watched two different full-length television sitcom episodes, at two independent sites (Caltech and Indiana University), in a total of over 150 adult participants (N = 48 autistic individuals with IQ in the normal range, 105 controls), and quantified gaze onto features of the videos using automated computer vision-based feature extraction.

Results: We found support for the second of these hypotheses. Autistic people and controls showed equivalently reliable gaze onto specific features of videos, such as faces, so much so that individuals could be identified significantly above chance from video epochs as short as 2 min using a fingerprinting approach. However, classification of participants into diagnostic groups based on their eye tracking data failed to produce clear group classifications, due to heterogeneity in the autistic group.

Limitations: Three limitations are the relatively small sample size, assessment across only two videos (from the same television series), and the absence of other dependent measures (e.g., neuroimaging or genetics) that might have revealed individual-level variability not evident with eye tracking. Future studies should expand to larger samples across longer longitudinal epochs, an aim that is now becoming feasible with Internet- and phone-based eye tracking.

Conclusions: These findings pave the way for the investigation of autism subtypes and for elucidating the specific visual features that best discriminate gaze patterns, directions that will also combine with and inform neuroimaging and genetic studies of this complex disorder.
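The fingerprinting result in the abstract lends itself to a simple illustration. The sketch below is a hypothetical, minimal version of such an analysis, not the authors' actual pipeline (which is available in the GitHub repository listed under data availability). The idea: each participant contributes a gaze-feature time series from each viewing epoch (e.g., the proportion of gaze falling on faces per time bin), and participants are matched across epochs by the highest Pearson correlation. All array names, noise levels, and parameters here are illustrative assumptions.

```python
import numpy as np

def fingerprint_identify(epoch_a, epoch_b):
    """Match participants across two epochs by gaze-feature similarity.

    epoch_a, epoch_b: (n_participants, n_timebins) arrays of a gaze
    feature per time bin (e.g., fraction of gaze on faces).
    Returns, for each row of epoch_a, the index of the best-matching
    row of epoch_b (highest Pearson correlation).
    """
    # z-score each participant's time series within each epoch
    za = (epoch_a - epoch_a.mean(1, keepdims=True)) / epoch_a.std(1, keepdims=True)
    zb = (epoch_b - epoch_b.mean(1, keepdims=True)) / epoch_b.std(1, keepdims=True)
    # all pairwise Pearson correlations between epochs
    corr = za @ zb.T / epoch_a.shape[1]
    return corr.argmax(axis=1)

# Simulated demo: each participant has a stable idiosyncratic gaze profile
# plus independent noise in each epoch (all values are made up).
rng = np.random.default_rng(0)
n, t = 30, 120                        # 30 participants, 120 time bins
profile = rng.normal(size=(n, t))     # individual-specific gaze signal
epoch1 = profile + 0.5 * rng.normal(size=(n, t))
epoch2 = profile + 0.5 * rng.normal(size=(n, t))

pred = fingerprint_identify(epoch1, epoch2)
accuracy = (pred == np.arange(n)).mean()   # chance level would be 1/30
```

If individual gaze profiles are stable relative to the epoch-to-epoch noise, identification accuracy far exceeds the 1/n chance level, which is the sense in which gaze can serve as a "fingerprint".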

Additional Information

© The Author(s) 2022. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Acknowledgements. We are grateful to Tim Armstrong, Steven Lograsso, Susannah Ferguson, Brad Caron, and Arispa Weigold for help with data collection, David Kahn for comments on the manuscript, and all our participants and their families for their participation in this time-intensive study.

Funding. This research was supported in part by grant R01MH110630 from NIMH (DPK/RA), the Simons Foundation Autism Research Initiative (RA), the Simons Foundation Collaboration on the Global Brain (542951; UK), and the Eagles Autism Foundation (DK).

Author contributions. DPK, LB, and RA designed the study. LKP and DPK recruited and assessed participants. DK, Caltech, and Indiana University personnel collected data. UK organized and preprocessed data, ensured data quality, and conducted all analyses. UK and RA drafted the paper and all co-authors provided extensive discussions and edits. All authors read and approved the final manuscript. Daniel P. Kennedy and Ralph Adolphs are joint senior authors.

Availability of data and materials. All data and code can be publicly accessed at https://github.com/adolphslab/adolphslab_eyetracking.

Ethics approval and consent to participate. This study was approved by the institutional review board of the California Institute of Technology and all participants gave written informed consent. All participants gave consent for publication as part of their informed consent.

Competing interests. The authors declare no competing interests.

Attached Files (2.0 MB total)

Published - 13229_2022_Article_517.pdf (1.4 MB, preview available)

Supplemental Material - 13229_2022_517_MOESM1_ESM.docx (636.1 kB)

Additional details

August 22, 2023
December 22, 2023