A Caltech Library Service

Surgical gestures as a method to quantify surgical performance and predict patient outcomes

Ma, Runzhuo and Ramaswamy, Ashwin and Xu, Jiashu and Trinh, Loc and Kiyasseh, Dani and Chu, Timothy N. and Wong, Elyssa Y. and Lee, Ryan S. and Rodriguez, Ivan and DeMeo, Gina and Desai, Aditya and Otiato, Maxwell X. and Roberts, Sidney I. and Nguyen, Jessica H. and Laca, Jasper and Liu, Yan and Urbanova, Katarina and Wagner, Christian and Anandkumar, Animashree and Hu, Jim C. and Hung, Andrew J. (2022) Surgical gestures as a method to quantify surgical performance and predict patient outcomes. npj Digital Medicine, 5. Art. No. 187. ISSN 2398-6352. PMCID PMC9780308. doi:10.1038/s41746-022-00738-y.

PDF (Published Version) - Creative Commons Attribution.


How well a surgery is performed impacts a patient’s outcomes; however, objective quantification of performance remains an unsolved challenge. Deconstructing a procedure into discrete instrument-tissue “gestures” is an emerging way to understand surgery. To establish this paradigm in a procedure where performance is the most important factor for patient outcomes, we identify 34,323 individual gestures performed in 80 nerve-sparing robot-assisted radical prostatectomies from two international medical centers. Gestures are classified into nine distinct dissection gestures (e.g., hot cut) and four supporting gestures (e.g., retraction). Our primary outcome is to identify factors impacting a patient’s 1-year erectile function (EF) recovery after radical prostatectomy. We find that less use of hot cut and more use of peel/push are statistically associated with a better chance of 1-year EF recovery. Our results also show interactions between surgeon experience and gesture types: similar gesture selection resulted in different EF recovery rates depending on surgeon experience. To further validate this framework, two teams independently construct distinct machine learning models using gesture sequences vs. traditional clinical features to predict 1-year EF. In both models, gesture sequences better predict 1-year EF (Team 1: AUC 0.77, 95% CI 0.73–0.81; Team 2: AUC 0.68, 95% CI 0.66–0.70) than traditional clinical features (Team 1: AUC 0.69, 95% CI 0.65–0.73; Team 2: AUC 0.65, 95% CI 0.62–0.68). Our results suggest that gestures provide a granular method to objectively indicate surgical performance and outcomes. Applying this methodology to other surgeries may lead to discoveries on how to improve surgery.
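The abstract's headline comparison is a point-estimate AUC with a 95% confidence interval for each model. As a minimal sketch of how such a comparison can be computed, the snippet below uses hypothetical synthetic data (not the study's dataset, and not the authors' models) with a percentile bootstrap for the interval; all variable names and the data-generating assumptions are illustrative only.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical stand-in data: 200 patients with binary 1-year EF recovery
# labels, plus predicted probabilities from two illustrative models
# (a "gesture-based" model and a "clinical-features" model).
y = rng.integers(0, 2, size=200)
p_gesture = np.clip(y * 0.4 + rng.normal(0.3, 0.2, size=200), 0, 1)
p_clinical = np.clip(y * 0.2 + rng.normal(0.4, 0.25, size=200), 0, 1)

def auc_with_ci(y_true, y_score, n_boot=1000, seed=0):
    """Point-estimate AUC plus a percentile bootstrap 95% CI."""
    rng = np.random.default_rng(seed)
    n = len(y_true)
    aucs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample patients with replacement
        if len(np.unique(y_true[idx])) < 2:
            continue  # AUC needs both classes present in the resample
        aucs.append(roc_auc_score(y_true[idx], y_score[idx]))
    lo, hi = np.percentile(aucs, [2.5, 97.5])
    return roc_auc_score(y_true, y_score), lo, hi

auc_g, lo_g, hi_g = auc_with_ci(y, p_gesture)
auc_c, lo_c, hi_c = auc_with_ci(y, p_clinical)
print(f"gesture model:  AUC {auc_g:.2f} (95% CI {lo_g:.2f}-{hi_g:.2f})")
print(f"clinical model: AUC {auc_c:.2f} (95% CI {lo_c:.2f}-{hi_c:.2f})")
```

The paper itself may use a different interval method (e.g., cross-validation folds or DeLong's method); the bootstrap here is just one common way to obtain the CI around an AUC estimate.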

Item Type: Article
Related URLs: PubMed Central (Article)
Author ORCIDs:
Ma, Runzhuo: 0000-0001-6381-2661
Ramaswamy, Ashwin: 0000-0002-8816-7838
Xu, Jiashu: 0000-0003-4093-2315
Kiyasseh, Dani: 0000-0002-2898-1790
Otiato, Maxwell X.: 0000-0001-6979-6316
Nguyen, Jessica H.: 0000-0003-0454-8463
Liu, Yan: 0000-0002-5837-4908
Anandkumar, Animashree: 0000-0002-6974-6797
Hu, Jim C.: 0000-0003-2562-8024
Hung, Andrew J.: 0000-0002-7201-6736
Additional Information: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit the Creative Commons website.

This study was supported in part by the National Cancer Institute under Award No. R01CA273031.

Contributions: A.J.H. conceived of the study. A.J.H. and J.C.H. obtained the funding. A.J.H., R.M., J.L., J.H.N., and C.W. designed and provided oversight for the administration and implementation of the study. R.M., T.N.C., I.R., G.D., A.D., M.X.O., K.U., S.I.R., and C.W. collected the data and annotated the surgical videos. R.M., J.X., L.T., and D.K. performed the data analysis and visualization. A.A. and Y.L. provided data analysis guidance and supervision. R.M., A.R., and R.S.L. wrote the draft of the manuscript.

Data availability: The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.

Code availability: The code of this article can be found by:

Competing interests: C.W. declares no competing non-financial interests but reports financial disclosures with Intuitive Surgical, Inc. A.A. declares no competing non-financial interests but is a paid employee of Nvidia. J.C.H. declares no competing non-financial interests but the following competing financial interests: salary support from the Frederick J. and Theresa Dow Wallace Fund of the New York and from a Prostate Cancer Foundation Challenge Award, as well as salary support from NIH R01 CA241758 and R01 CA259173 and PCORI CER-2019C1-15682 and CER-2019C2-17372. A.J.H. declares no competing non-financial interests but reports financial disclosures with Intuitive Surgical, Inc. The remaining authors declare no competing interests.
Funding Agency / Grant Number: National Cancer Institute / R01CA273031
PubMed Central ID: PMC9780308
Record Number: CaltechAUTHORS:20230209-988069100.14
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 119179
Deposited By: Research Services Depository
Deposited On: 15 Mar 2023 23:42
Last Modified: 23 May 2023 21:37
