Smartphone-based gaze estimation for in-home autism research
- Affiliation: California Institute of Technology
- ORCID: 0000-0002-3128-8313
Abstract
Atypical gaze patterns are a promising biomarker of autism spectrum disorder. Measuring gaze accurately, however, typically requires highly controlled laboratory studies using specialized equipment that is often expensive, limiting the scalability of these approaches. Here we test whether a recently developed smartphone-based gaze estimation method could overcome such limitations and take advantage of the ubiquity of smartphones. As a proof of principle, we measured gaze while a small sample of well-assessed autistic participants and controls watched videos on a smartphone, both in the laboratory (with lab personnel) and in remote home settings (alone). We demonstrate that gaze data can be collected efficiently, in-home and longitudinally, by participants themselves, with accuracy sufficient for quantitative, feature-based analysis (gaze estimation error below 1° of visual angle on average). Using this approach, we show that autistic individuals spend less time gazing at human faces and more time on non-social background features, reproducing established findings in autism with smartphones alone and no additional hardware. Our approach provides a foundation for scaling future research to larger and more representative participant groups at vastly reduced cost, while also enabling better inclusion of underserved communities.
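For context on the accuracy figure, gaze error measured on the screen can be expressed in degrees of visual angle using the standard geometric conversion. The sketch below illustrates this; the 0.5 cm on-screen error and 30 cm viewing distance are illustrative assumptions, not values reported by the study.

```python
import math

def error_visual_angle_deg(err_cm: float, viewing_distance_cm: float) -> float:
    """Convert an on-screen gaze error (cm) to degrees of visual angle."""
    return math.degrees(2 * math.atan(err_cm / (2 * viewing_distance_cm)))

# Illustration: a 0.5 cm on-screen error at a typical ~30 cm smartphone
# viewing distance corresponds to just under 1 degree of visual angle.
print(f"{error_visual_angle_deg(0.5, 30):.2f} deg")  # -> 0.95 deg
```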
Copyright and License
© 2024 The Authors. Autism Research published by International Society for Autism Research and Wiley Periodicals LLC. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.
Acknowledgement
We would like to thank Dr. Umit Keles for helpful advice on automatic feature segmentation and data analysis, members of the Emotion and Social Cognition Laboratory at Caltech for helpful discussions, and all our participants and their families for their participation. This study was funded in part by grants from the Della Martin Foundation, the Simons Foundation Autism Research Initiative, and Google.
Data Availability
To protect participants' privacy, captured full-face image data will not be made publicly available. De-identified gaze estimates (x- and y-coordinates of estimated gaze on screen) and the video stimuli are available upon reasonable request. Implementation details of the gaze estimation model are available in Valliappan et al. (2020). Preprocessing and analyses of gaze data and automatic feature segmentation were performed using custom MATLAB and Python scripts, which can be accessed at https://github.com/nayeonckim/smartphoneET_autism_analysis.
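The released gaze estimates are on-screen coordinates, and the reported analyses are feature-based (e.g., gaze time on faces versus background). Purely as an illustration of that kind of analysis, and not a reproduction of the linked repository's pipeline, here is a minimal Python sketch computing the fraction of gaze samples that land on a segmented feature; the function name, array layout, and mask are hypothetical.

```python
import numpy as np

def gaze_fraction_on_feature(gaze_xy: np.ndarray, feature_mask: np.ndarray) -> float:
    """Fraction of valid gaze samples that fall on a segmented feature.

    gaze_xy      : (N, 2) array of estimated on-screen gaze points (pixels).
    feature_mask : (H, W) boolean mask marking one feature class (e.g., faces).
    """
    valid = ~np.isnan(gaze_xy).any(axis=1)       # drop missing samples
    pts = gaze_xy[valid]
    h, w = feature_mask.shape
    x = np.clip(pts[:, 0].round().astype(int), 0, w - 1)
    y = np.clip(pts[:, 1].round().astype(int), 0, h - 1)
    return float(feature_mask[y, x].mean())

# Hypothetical usage: gaze estimates against a face mask for one video frame.
gaze = np.array([[120.4, 80.2], [300.0, 420.9], [np.nan, np.nan]])
mask = np.zeros((480, 360), dtype=bool)          # portrait-orientation frame
mask[60:120, 100:160] = True                     # face region (illustrative)
print(gaze_fraction_on_feature(gaze, mask))      # -> 0.5
```

In practice such a fraction would be computed per frame against time-varying feature masks and then aggregated per participant and condition.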
Conflict of Interest
J.H., K.K., and V.N. are employees of Google. N.D. was at Google when the study was designed and conducted. All other authors declare no competing interests.
Files
Name | Size | MD5 checksum
---|---|---
— | 1.1 MB | md5:b1013ca4f7c088954b1a55279ace33d2
— | 850.5 kB | md5:ff5c6b67f7cbd845f895fe63758a4ed8
Additional details
- ISSN: 1939-3806
- Funding: Della Martin Foundation; Simons Foundation; Google (United States)
- Caltech groups: Tianqiao and Chrissy Chen Institute for Neuroscience; Division of Biology and Biological Engineering