Image imputation with conditional generative adversarial networks captures clinically relevant imaging features on computed tomography
- Editor: Xu, Jie
Abstract
Kidney cancer is among the 10 most common malignancies in adults and is commonly evaluated with four-phase computed tomography (CT) imaging. However, missing or corrupted images remain a significant problem in medical imaging that impairs the detection, diagnosis, and treatment planning of kidney cancer. Deep learning approaches based on conditional generative adversarial networks (cGANs) have recently shown technical promise in imputing missing imaging data from these four-phase studies. In this study, we explored the clinical utility of these imputed images. We used a cGAN trained on 333 patients, tasked with imputing the image of any one phase given the other three. We assessed clinical utility on the imputed images of the 37 patients in the test set by manually extracting 21 clinically relevant imaging features and comparing them to their ground-truth counterparts. All 13 categorical clinical features showed greater than 85% agreement between true images and their imputed counterparts, and this high agreement was maintained when stratifying across imaging phases. Imputed images also showed good agreement with true images in select radiomic features, including mean intensity and enhancement, and possessed the features characteristic of a benign or malignant diagnosis at a rate equivalent to true images. In conclusion, cGAN-imputed images have strong potential for clinical use because they retain clinically relevant qualitative and quantitative features.
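The abstract's headline result is a per-feature agreement rate between annotations on true and imputed images. A minimal sketch of that computation is below; the feature name and patient values are hypothetical (the study annotated 21 features, 13 categorical, for 37 test patients), and this is not the authors' analysis code.

```python
# Hedged sketch: per-feature agreement between annotations on true images
# and their cGAN-imputed counterparts. All data here are toy/hypothetical.

def agreement_rate(true_vals, imputed_vals):
    """Fraction of patients whose annotated categorical feature matches
    between the true image and its imputed counterpart."""
    assert len(true_vals) == len(imputed_vals)
    matches = sum(t == i for t, i in zip(true_vals, imputed_vals))
    return matches / len(true_vals)

# Toy example: one categorical feature (e.g., an enhancement pattern
# label) annotated for 10 hypothetical patients.
true_feature    = ["hetero", "homo", "homo", "hetero", "homo",
                   "hetero", "homo", "homo", "hetero", "homo"]
imputed_feature = ["hetero", "homo", "homo", "hetero", "hetero",
                   "hetero", "homo", "homo", "hetero", "homo"]

rate = agreement_rate(true_feature, imputed_feature)
print(f"Agreement: {rate:.0%}")  # 9 of 10 labels match -> 90%
```

In the study, a rate computed this way exceeded 0.85 for every one of the 13 categorical features, both pooled and stratified by imaging phase.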
Copyright and License
© 2025 Rich et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding
- Author initials: V.D.; Grant number: W911F2010050; Funder: Army Research Office (ARO); URL: https://www.grants.gov/; Sponsor role: No
- Author initials: R.R.; Grant number: N/A; Funder: Ming-Hsieh Institute; URL: https://mhicancer.usc.edu; Sponsor role: No
Data Availability
The annotated features are available in the supplementary material. The raw radiologic imaging data is not available at this time due to IRB regulations and privacy considerations.
Conflict of Interest
I have read the journal's policy and the authors of this manuscript have the following competing interests: 1. Vinay Duddalwar is a consultant to Radmetrix, Roche, and Deeptek. The authors have no other relevant financial or non-financial interests to disclose.
Supplemental Material
S1 Fig. Heatmap of agreement for each categorical feature by each pair of real-imputed images.
Light blue = feature agreement; dark blue = feature disagreement. (TIFF)
S2 Fig. A representative pair of images with above-average Hallucination Index.
Red outline = tumor boundary. (TIFF)
S1 Table. Analysis of patient image information across the clinical features of interest for the ground truth and cGAN-generated images. (XLSX)
Additional details
- PMID: 40802824
- PMCID: PMC12349720
- United States Army Research Office: W911F2010050
- Ming-Hsieh Institute: None
- Caltech groups: Division of Biology and Biological Engineering (BBE)
- Publication Status: Published