Published November 2020 | Published + Submitted versions
Journal Article | Open Access

Massively parallel Bayesian inference for transient gravitational-wave astronomy


Understanding the properties of transient gravitational waves (GWs) and their sources is of broad interest in physics and astronomy. Bayesian inference is the standard framework for astrophysical measurement in transient GW astronomy. Usually, stochastic sampling algorithms are used to estimate posterior probability distributions over the parameter spaces of models describing the experimental data. The most physically accurate models typically carry a large computational overhead, which can make data analysis extremely time consuming, or even prohibitive. In some cases, highly specialized optimizations can mitigate these issues, though they can be difficult to implement and to generalize to arbitrary models of the data. Here, we investigate an accurate, flexible, and scalable method for astrophysical inference: parallelized nested sampling. The reduction in the wall time of inference scales almost linearly with the number of parallel processes running on a high-performance computing cluster. By utilizing a pool of several hundred or thousand CPUs in a high-performance cluster, the long wall times of many astrophysical inferences can be alleviated while simultaneously ensuring that any GW signal model can be used 'out of the box', i.e. without additional optimization or approximation. Our method will be useful both to the LIGO-Virgo-KAGRA collaborations and to the wider scientific community performing astrophysical analyses on GWs. An implementation is available in the open-source gravitational-wave inference library pBilby (parallel bilby).
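The core idea behind the parallelization described above can be sketched in a few lines: at each iteration of a nested sampler, the expensive likelihood evaluations for proposed points are independent of one another, so they can be farmed out to a pool of workers. The toy example below is not pBilby's actual implementation (which distributes work over an HPC cluster via MPI); the Gaussian `log_likelihood` and the `propose_points` helper are hypothetical stand-ins, and Python's `multiprocessing` pool replaces MPI for brevity.

```python
from multiprocessing import Pool

def log_likelihood(theta):
    # Toy Gaussian log-likelihood, standing in for an expensive
    # waveform-model likelihood in a real GW analysis.
    return -0.5 * sum(x * x for x in theta)

def propose_points(n, ndim=2):
    # Deterministic stand-in for the sampler's proposal step.
    return [[(i + j) / n for j in range(ndim)] for i in range(n)]

if __name__ == "__main__":
    points = propose_points(100)
    # Serial evaluation: what a single-core sampler would do.
    serial = [log_likelihood(p) for p in points]
    # Parallel evaluation across a worker pool: the step that
    # parallelized nested sampling distributes over many cores.
    with Pool(4) as pool:
        parallel = pool.map(log_likelihood, points)
    assert serial == parallel  # identical results, reduced wall time
```

Because each likelihood call is independent, the wall time for this step shrinks roughly in proportion to the number of workers, which is the source of the near-linear scaling reported in the abstract.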

Additional Information

© 2020 The Author(s). Published by Oxford University Press on behalf of the Royal Astronomical Society. This article is published and distributed under the terms of the Oxford University Press Standard Journals Publication Model (https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model). Accepted 2020 August 13. Received 2020 August 4; in original form 2020 April 26.

This work is supported through Australian Research Council (ARC) Centre of Excellence CE170100004. The analyses presented in this paper were performed using the supercomputer cluster at the Swinburne University of Technology (SSTAR). This document has LIGO Document number P1900255-v1.

We would like to thank Mathew Pitkin, Roberto Cotesta, Simon Stevenson, Serguei Ossokine, and Scott Coughlin for extensive testing of pBilby, and Eve Chase for providing information about the GWTC-1 analyses performed using SEOBNRv3. Additional thanks to Colin Capano for reviewing this manuscript. Thanks also to the SSTAR system admins for their support with all things MPI and for their patience. We are grateful for insightful comments from Vivien Raymond, Eve Chase, Richard O'Shaughnessy, Moritz Hubner, Michele Vallisneri, Alessandra Buonanno, Vicky Kalogera, and the LIGO-Virgo Parameter Estimation and Coalescing Compact Binary working groups. Additional thanks to Joshua Speagle for pointing out the scaling relation for parallel nested sampling.

This research has made use of data, software, and/or web tools obtained from the Gravitational Wave Open Science Center (https://www.gw-openscience.org), a service of LIGO Laboratory, the LIGO Scientific Collaboration, and the Virgo Collaboration. LIGO is funded by the U.S. National Science Foundation. Virgo is funded by the French Centre National de la Recherche Scientifique (CNRS), the Italian Istituto Nazionale di Fisica Nucleare (INFN), and the Dutch Nikhef, with contributions by Polish and Hungarian institutes.

Data Availability
The pBilby software library is available at https://git.ligo.org/lscsoft/parallel_bilby/.

Attached Files

Published - staa2483.pdf

Submitted - 1909.11873.pdf


