CaltechAUTHORS
  A Caltech Library Service

QuakeFlow: a scalable machine-learning-based earthquake monitoring workflow with cloud computing

Zhu, Weiqiang and Hou, Alvin Brian and Yang, Robert and Datta, Avoy and Mousavi, S. Mostafa and Ellsworth, William L. and Beroza, Gregory C. (2022) QuakeFlow: a scalable machine-learning-based earthquake monitoring workflow with cloud computing. Geophysical Journal International, 232 (1). pp. 684-693. ISSN 0956-540X. doi:10.1093/gji/ggac355. https://resolver.caltech.edu/CaltechAUTHORS:20221017-12657600.21

Full text is not posted in this repository. Consult Related URLs below.

Use this Persistent URL to link to this item: https://resolver.caltech.edu/CaltechAUTHORS:20221017-12657600.21

Abstract

Earthquake monitoring workflows are designed to detect earthquake signals and to determine source characteristics from continuous waveform data. Recent developments in deep learning seismology have been used to improve tasks within earthquake monitoring workflows that allow the fast and accurate detection of up to orders of magnitude more small events than are present in conventional catalogues. To facilitate the application of machine-learning algorithms to large-volume seismic records at scale, we developed a cloud-based earthquake monitoring workflow, QuakeFlow, which applies multiple processing steps to generate earthquake catalogues from raw seismic data. QuakeFlow uses a deep learning model, PhaseNet, for picking P/S phases and a machine learning model, GaMMA, for phase association with approximate earthquake location and magnitude. Each component in QuakeFlow is containerized, allowing straightforward updates to the pipeline with new deep learning/machine learning models, as well as the ability to add new components, such as earthquake relocation algorithms. We built QuakeFlow in Kubernetes to make it auto-scale for large data sets and to make it easy to deploy on cloud platforms, which enables large-scale parallel processing. We used QuakeFlow to process three years of continuous archived data from Puerto Rico within a few hours, and found more than a factor of ten more events that occurred on many of the same structures as previously known seismicity. We applied QuakeFlow to monitor earthquakes in Hawaii and found over an order of magnitude more events than are in the standard catalogue, including many events that illuminate the deep structure of the magmatic system. We also added Kafka and Spark streaming to deliver real-time earthquake monitoring results. QuakeFlow is an effective and efficient approach both for improving real-time earthquake monitoring and for mining archived seismic data sets.


Item Type: Article
Related URLs:
URL | URL Type | Description
https://doi.org/10.1093/gji/ggac355 | DOI | Article
ORCID:
Author | ORCID
Beroza, Gregory C. | 0000-0002-8667-1838
Additional Information: We thank Martijn van den Ende, Jannes Münchmeyer and Margarita Segou for their constructive reviews and suggestions. We thank Miao Zhang, Yongsoo Park and Ian McBrearty for helpful discussions. The facilities of IRIS Data Services, and specifically the IRIS Data Management Center, were used for access to waveforms, related metadata and/or derived products used in this study. This work was supported by AFRL under contract number FA9453-19-C-0073.
Group: Seismological Laboratory
Funders:
Funding Agency | Grant Number
Air Force Research Laboratory (AFRL) | FA9453-19-C-0073
Issue or Number: 1
DOI: 10.1093/gji/ggac355
Record Number: CaltechAUTHORS:20221017-12657600.21
Persistent URL: https://resolver.caltech.edu/CaltechAUTHORS:20221017-12657600.21
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 117460
Collection: CaltechAUTHORS
Deposited By: Research Services Depository
Deposited On: 21 Oct 2022 01:18
Last Modified: 25 Oct 2022 19:21
