Journal Article (Open) | Published April 2019

A General Framework for Multi-fidelity Bayesian Optimization with Gaussian Processes


How can we efficiently gather information to optimize an unknown function, when presented with multiple, mutually dependent information sources with different costs? For example, when optimizing a physical system, intelligently trading off computer simulations and real-world tests can lead to significant savings. Existing multi-fidelity Bayesian optimization methods, such as multi-fidelity GP-UCB or Entropy Search-based approaches, either make simplistic assumptions on the interaction among different fidelities or use simple heuristics that lack theoretical guarantees. In this paper, we study multi-fidelity Bayesian optimization with complex structural dependencies among multiple outputs, and propose MF-MI-Greedy, a principled algorithmic framework for addressing this problem. In particular, we model different fidelities using additive Gaussian processes based on shared latent relationships with the target function. Then we use cost-sensitive mutual information gain for efficient Bayesian optimization. We propose a simple notion of regret which incorporates the varying cost of different fidelities, and prove that MF-MI-Greedy achieves low regret. We demonstrate the strong empirical performance of our algorithm on both synthetic and real-world datasets.
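To make the ideas in the abstract concrete, the following is a minimal 1-D sketch of the modeling and selection steps it describes: a low-fidelity source and the target are modeled with an additive Gaussian process sharing a latent component, and the next query is chosen by cost-sensitive information gain (information gained about the target per unit cost). This is not the authors' MF-MI-Greedy implementation; the kernels, lengthscales, costs, and all function names here are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, ls=0.3, var=1.0):
    """Squared-exponential kernel between 1-D point sets a, b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

# Additive multi-fidelity model (illustrative, not the paper's exact kernel):
#   f_low(x)  = g(x)          -- shared latent function (fidelity label 0)
#   f_high(x) = g(x) + e(x)   -- target = shared latent + error term (label 1)
def joint_kernel(X1, m1, X2, m2, ls_g=0.3, ls_e=0.1, var_e=0.25):
    """Covariance between points tagged with fidelity labels m1, m2."""
    K = rbf(X1, X2, ls=ls_g)                       # shared component g
    both_high = (m1[:, None] == 1) & (m2[None, :] == 1)
    K = K + both_high * rbf(X1, X2, ls=ls_e, var=var_e)  # error term e
    return K

def posterior(Xobs, mobs, yobs, Xq, mq, noise=1e-4):
    """GP posterior mean/covariance at query points (Xq, mq)."""
    K = joint_kernel(Xobs, mobs, Xobs, mobs) + noise * np.eye(len(Xobs))
    Ks = joint_kernel(Xq, mq, Xobs, mobs)
    Kss = joint_kernel(Xq, mq, Xq, mq)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, yobs))
    V = np.linalg.solve(L, Ks.T)
    return Ks @ alpha, Kss - V.T @ V

def select_query(Xobs, mobs, yobs, Xcand, costs=(1.0, 10.0), noise=1e-4):
    """Pick (x, fidelity) maximizing info gain about the target per unit cost.

    The gain of observing fidelity m at x about the target f_high(x) is
    0.5 * log(var_target / var_target_given_obs), from the joint Gaussian.
    """
    best, best_score = None, -np.inf
    for m in (0, 1):
        for x in Xcand:
            Xq, mq = np.array([x, x]), np.array([m, 1])
            _, cov = posterior(Xobs, mobs, yobs, Xq, mq, noise)
            var_obs = cov[0, 0] + noise          # noisy observation at fidelity m
            var_tgt = cov[1, 1]                  # target variance before the query
            var_post = var_tgt - cov[0, 1] ** 2 / var_obs
            ig = 0.5 * np.log(max(var_tgt, 1e-12) / max(var_post, 1e-12))
            score = ig / costs[m]                # cost-sensitive criterion
            if score > best_score:
                best, best_score = (float(x), m), score
    return best
```

With this scoring rule, a cheap low-fidelity query wins whenever its (smaller) information gain about the target exceeds the high-fidelity gain scaled down by the cost ratio, which is the trade-off the abstract describes.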

Additional Information

© 2019 by the author(s). This work was supported in part by NSF Award #1645832, Northrop Grumman, Bloomberg, Raytheon, PIMCO, and a Swiss NSF Early Mobility Postdoctoral Fellowship.

Attached Files

Published - song19b.pdf

Submitted - 1811.00755.pdf

Supplemental Material - song19b-supp.pdf


