Published April 2022
Journal Article

Data-Driven Synthesis of Broadband Earthquake Ground Motions Using Artificial Intelligence


Robust estimation of ground motions generated by scenario earthquakes is critical for many engineering applications. We leverage recent advances in generative adversarial networks (GANs) to develop a new framework for synthesizing earthquake acceleration time histories. Our approach extends the Wasserstein GAN formulation to allow for the generation of ground motions conditioned on a set of continuous physical variables. Our model is trained to approximate the intrinsic probability distribution of a massive set of strong-motion recordings from Japan. We show that the trained generator model can synthesize realistic three-component accelerograms conditioned on magnitude, distance, and V_(S30). Our model captures most of the relevant statistical features of the acceleration spectra and waveform envelopes. The output seismograms display clear P- and S-wave arrivals with the appropriate energy content and relative onset timing. The synthesized peak ground acceleration estimates are also consistent with observations. We develop a set of metrics to assess the stability of the training process and to tune model hyperparameters. We further show that the trained generator network can interpolate to conditions for which no earthquake ground-motion recordings exist. Our approach allows for the on-demand synthesis of accelerograms for engineering purposes.
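To make the conditioning mechanism concrete, the sketch below shows the shape of the problem: a generator maps a latent noise vector plus a vector of continuous physical conditions (magnitude, distance, V_(S30)) to a three-component waveform. This is a minimal, untrained stand-in written for illustration only; the dimensions, scaling constants, and single dense layer are assumptions, not the authors' architecture, which is a trained Wasserstein GAN.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not taken from the paper):
LATENT_DIM = 100      # length of the noise vector z
COND_DIM = 3          # magnitude, distance, V_S30
N_COMPONENTS = 3      # three-component accelerogram
N_SAMPLES = 1000      # time steps per component

def normalize_conditions(magnitude, distance_km, vs30):
    """Scale the continuous conditioning variables to roughly [0, 1].
    The reference ranges here are assumed for illustration."""
    return np.array([
        (magnitude - 4.0) / 4.0,   # magnitudes ~4-8
        distance_km / 200.0,       # distances up to ~200 km
        vs30 / 1500.0,             # V_S30 up to ~1500 m/s
    ])

class ToyConditionalGenerator:
    """Stand-in for the trained generator: one dense layer mapping
    [noise, conditions] to a (3, N_SAMPLES) waveform. A real model
    would use deep (transposed-)convolutional layers and be trained
    against a Wasserstein critic."""
    def __init__(self):
        in_dim = LATENT_DIM + COND_DIM
        self.W = rng.normal(0.0, 1.0 / np.sqrt(in_dim),
                            size=(N_COMPONENTS * N_SAMPLES, in_dim))
        self.b = np.zeros(N_COMPONENTS * N_SAMPLES)

    def __call__(self, z, cond):
        x = np.concatenate([z, cond])
        out = np.tanh(self.W @ x + self.b)   # bounded synthetic output
        return out.reshape(N_COMPONENTS, N_SAMPLES)

# Synthesize one accelerogram conditioned on a scenario (M, R, V_S30):
gen = ToyConditionalGenerator()
z = rng.normal(size=LATENT_DIM)
cond = normalize_conditions(magnitude=6.5, distance_km=40.0, vs30=400.0)
waveform = gen(z, cond)
print(waveform.shape)  # (3, 1000)
```

Because the conditioning variables are continuous inputs rather than discrete class labels, the same trained generator can be queried at scenario values between observed recordings, which is the interpolation property noted in the abstract.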

Additional Information

This research was partially supported by the U.S. Geological Survey/National Earthquake Hazards Reduction Program (USGS/NEHRP) Grant G19AP00035 and by the Southern California Earthquake Center (SCEC). SCEC is funded by National Science Foundation (NSF) Cooperative Agreement EAR‐1600087 and USGS Cooperative Agreement G17AC00047. The authors thank Egill Hauksson for helpful discussions, and Fabrice Cotton and Mostafa Mousavi for their comments and suggestions.

Additional details

August 22, 2023
October 24, 2023