Published April 25, 2024 | in press
Journal Article | Open Access

Smartphone‐based gaze estimation for in‐home autism research

Abstract

Atypical gaze patterns are a promising biomarker of autism spectrum disorder. Measuring gaze accurately, however, typically requires highly controlled laboratory studies using specialized equipment that is often expensive, limiting the scalability of these approaches. Here we test whether a recently developed smartphone-based gaze estimation method could overcome such limitations and take advantage of the ubiquity of smartphones. As a proof-of-principle, we measured gaze while a small sample of well-assessed autistic participants and controls watched videos on a smartphone, both in the laboratory (with lab personnel) and in remote home settings (alone). We demonstrate that gaze data can be efficiently collected, in-home and longitudinally by participants themselves, with sufficiently high accuracy (gaze estimation error below 1° visual angle on average) for quantitative, feature-based analysis. Using this approach, we show that autistic individuals have reduced gaze time on human faces and longer gaze time on non-social features in the background, thereby reproducing established findings in autism using just smartphones and no additional hardware. Our approach provides a foundation for scaling future research with larger and more representative participant groups at vastly reduced cost, while also enabling better inclusion of underserved communities.

Copyright and License

Acknowledgement

We would like to thank Dr. Umit Keles for helpful advice on automatic feature segmentation and data analysis, members of the Emotion and Social Cognition Laboratory at Caltech for helpful discussions, and all our participants and their families for their participation. This study was funded in part by grants from the Della Martin Foundation, the Simons Foundation Autism Research Initiative, and Google.

Data Availability

To protect participants' privacy, captured full-face image data will not be publicly available. The de-identified gaze estimates (x- and y-coordinates of estimated gaze on screen) and video stimuli are available upon reasonable request. Implementation details of the gaze estimation model are available in Valliappan et al. (2020). Preprocessing and analyses of gaze data, as well as automatic feature segmentation, were performed using custom MATLAB and Python scripts, available at https://github.com/nayeonckim/smartphoneET_autism_analysis.
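For readers interested in the kind of feature-based analysis the de-identified data support, the sketch below illustrates one way to compute the fraction of gaze samples falling on a segmented region (e.g. a face) from on-screen x/y gaze coordinates. This is not the authors' pipeline (that lives in the linked repository); the function name, the bounding-box representation, and the toy data are all hypothetical, shown only to make the data format concrete.

```python
import numpy as np

def fraction_on_region(gaze_xy, boxes):
    """Fraction of gaze samples falling inside a per-frame bounding box.

    gaze_xy: (N, 2) array of on-screen gaze coordinates (x, y),
             one row per video frame / gaze sample.
    boxes:   (N, 4) array of (x_min, y_min, x_max, y_max) per frame,
             e.g. a face region from automatic feature segmentation.
    """
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    boxes = np.asarray(boxes, dtype=float)
    # A sample is "on" the region if it lies within the box for that frame.
    inside = (
        (gaze_xy[:, 0] >= boxes[:, 0]) & (gaze_xy[:, 0] <= boxes[:, 2])
        & (gaze_xy[:, 1] >= boxes[:, 1]) & (gaze_xy[:, 1] <= boxes[:, 3])
    )
    return inside.mean()

# Toy example: 4 gaze samples, a fixed face box around the screen centre
# (coordinates normalised to [0, 1]).
gaze = [(0.5, 0.5), (0.9, 0.9), (0.45, 0.55), (0.1, 0.1)]
boxes = [(0.4, 0.4, 0.6, 0.6)] * 4
print(fraction_on_region(gaze, boxes))  # 0.5
```

Group differences in gaze time on faces versus background features, as reported in the abstract, reduce to comparing such fractions across participants.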

Data S1. Supporting Information

Conflict of Interest

J.H., K.K., and V.N. are employees of Google. N.D. was at Google when the study was designed and conducted. All other authors declare no competing interests.

Files

Autism Research - 2024 - Kim - Smartphone‐based gaze estimation for in‐home autism research.pdf
Files (1.9 MB)

Additional details

Created:
May 31, 2024
Modified:
May 31, 2024