
Ice Thickness From Deep Learning and Conditional Random Fields: Application to Ice-Penetrating Radar Data With Radiometric Validation

Liu-Schiaffini, Miguel and Ng, Gregory and Grima, Cyril and Young, Duncan (2022) Ice Thickness From Deep Learning and Conditional Random Fields: Application to Ice-Penetrating Radar Data With Radiometric Validation. IEEE Transactions on Geoscience and Remote Sensing, 60 . Art. No. 5119014. ISSN 0196-2892. doi:10.1109/tgrs.2022.3214147.

Full text is not posted in this repository. Consult Related URLs below.


Identifying the location of the ice–bedrock interface of glaciers and ice sheets is crucial for a wide range of geophysical applications, such as searching for liquid water in basal regions and computing ice thickness to quantify ice sheet and glacier mass balance. Simple, record-by-record approaches to detecting the bottom of the ice echo may be affected by spurious off-nadir noise that requires significant manual interaction to correct. In this article, we propose a deep learning model based on convolutional neural networks (CNNs) and continuous conditional random fields (CCRFs) to automate ice bed identification and better capture fine-grained basal detail. We deploy this approach on High Capability Radar Sounder (HiCARS) radargrams, marking the first application of deep learning methods to this dataset. Intuitively, our CNN captures the global geometry of the ice bed, while the CCRF adjusts the initial CNN outputs to better incorporate fine-scale spatial information into the final prediction. We also develop a coherent geophysical framework using three echo characters (along-track continuity, relative delay, and signal coherency) to compare our model’s outputs with those of a manually targeted approach. Our analysis suggests that our CNN + CCRF model is as suitable as the manual approach for radiometric applications, and that it outperforms the manual technique in identifying the first continuous return, which is most often the near-nadir reflection. Thus, our approach is more universal than the current manual labeling methodology in the range of geophysical applications where it can be used, and it provides better confidence regarding the source of the basal return detected.
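To illustrate the refinement step described above, the following is a minimal NumPy sketch of how a continuous CRF can smooth noisy per-column ice-bed picks. The function name `ccrf_smooth`, the quadratic chain potential, and the single `pairwise_weight` parameter are assumptions for illustration; the paper's actual CCRF formulation and learned parameters may differ.

```python
import numpy as np

def ccrf_smooth(unary_picks, pairwise_weight=1.0):
    """Refine per-column depth picks with a Gaussian continuous CRF.

    Minimizes  sum_i (y_i - x_i)^2 + w * sum_i (y_i - y_{i+1})^2,
    where x are the initial (e.g., CNN-derived) picks and w controls
    along-track smoothness. The MAP solution of this quadratic energy
    is the solution of the tridiagonal linear system (I + w*L) y = x,
    with L the graph Laplacian of the along-track chain.
    """
    x = np.asarray(unary_picks, dtype=float)
    n = len(x)
    A = np.eye(n)
    for i in range(n - 1):
        # each neighboring pair contributes w*(y_i - y_{i+1})^2
        A[i, i] += pairwise_weight
        A[i + 1, i + 1] += pairwise_weight
        A[i, i + 1] -= pairwise_weight
        A[i + 1, i] -= pairwise_weight
    return np.linalg.solve(A, x)
```

With `pairwise_weight=0` the picks pass through unchanged; larger weights suppress isolated outliers (such as off-nadir clutter picks) while preserving the overall bed geometry supplied by the unary term.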

Item Type: Article
Related URLs:
ORCID:
Liu-Schiaffini, Miguel: 0000-0001-9685-8383
Ng, Gregory: 0000-0002-6743-0211
Grima, Cyril: 0000-0001-7135-3055
Young, Duncan: 0000-0002-6866-8176
Additional Information: This work was supported in part by the G. Unger Vetlesen Foundation and in part by NSF under Grant 1443690 and Grant 2127606 for data analysis; and in part by NSF under Grant 0230197, Grant 0733025, and Grant 1543452, in part by the Australian Antarctic Science Project 4346, in part by the National Aeronautics and Space Administration (NASA) Operation IceBridge, and in part by the Weston Family Foundation for data collection. This is UTIG contribution 3882. The authors would like to thank Anja Rutishauser, Dillon Buhl, and Jamin Greenbaum for helpful comments and Donald Blankenship for data access. The authors would also like to thank Mrinal Sen for access to his GPU system and the pickers (Erick Leuro, Hunter Danque, Neil Danque, Jeremy Krimmel, Kevin Carver, Alyssa Jones, Gail Muldoon, Sam Christian, Lauren Schwartz, and Kristian Chan) who contributed to forming our training and test sets by manually labeling the ice bed interface in radargrams. The authors used the program developed in [58] to create our CNN model architecture figures.
Funding Agency: Grant Number
G. Unger Vetlesen Foundation: UNSPECIFIED
Australian Antarctic Science Project: 4346
Weston Family Foundation: UNSPECIFIED
Record Number: CaltechAUTHORS:20221116-601553800.7
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 117888
Deposited By: Research Services Depository
Deposited On: 29 Nov 2022 18:19
Last Modified: 29 Nov 2022 18:19
