Published April 2021
Journal Article | Open Access

Minimax Model Learning


We present a novel off-policy loss function for learning a transition model in model-based reinforcement learning. Notably, our loss is derived from the off-policy policy evaluation objective with an emphasis on correcting distribution shift. Compared to previous model-based techniques, our approach allows for greater robustness under model misspecification or under the distribution shift induced by learning or evaluating policies that differ from the data-generating policy. We provide a theoretical analysis and show empirical improvements over existing model-based off-policy evaluation methods. We further show that our loss can be used for off-policy optimization (OPO) and demonstrate its integration with more recent improvements in OPO.
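To make the minimax idea concrete, below is a minimal sketch (not the paper's actual loss or experiments): a transition model is trained so that no critic from an adversarial test-function class can distinguish real next states from modeled ones. All names, the synthetic 1-D dynamics, and the restriction to linear, norm-bounded critics are illustrative assumptions; for that critic class the inner maximization has a closed form, so the minimax objective reduces to the squared mean moment violation.

```python
import torch

torch.manual_seed(0)

# Illustrative synthetic data (not from the paper):
# 1-D dynamics s' = 0.8*s + 0.1*a + noise.
n = 512
s = torch.randn(n, 1)
a = torch.randn(n, 1)
s_next = 0.8 * s + 0.1 * a + 0.05 * torch.randn(n, 1)
x = torch.cat([s, a], dim=1)

# A deterministic linear transition model, standing in for the model class.
model = torch.nn.Linear(2, 1)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

def minimax_loss():
    # Adversarial objective: max over critics f of
    #   (E[f(s')] - E[f(model(s, a))])^2.
    # For linear critics f(s) = c*s with |c| <= 1, this maximum is
    # attained at c = +/-1, giving the closed form below.
    residual_mean = (s_next - model(x)).mean()
    return residual_mean ** 2

initial = minimax_loss().item()
for _ in range(500):
    opt.zero_grad()
    minimax_loss().backward()
    opt.step()
final = minimax_loss().item()
print(initial, final)  # the moment violation shrinks as the model trains
```

With a richer critic class (e.g. an MLP trained by alternating gradient ascent), the inner maximum has no closed form and the loss is optimized as a true minimax game, which is closer in spirit to the method the abstract describes.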

Additional Information

© 2021 by the author(s). Proceedings of the 24th International Conference on Artificial Intelligence and Statistics (AISTATS) 2021, San Diego, California, USA. PMLR: Volume 130. Cameron Voloshin is supported in part by a Kortschak Fellowship. This work is also supported in part by NSF # 1645832, NSF # 1918839, and funding from Beyond Limits. Nan Jiang is sponsored in part by the DEVCOM Army Research Laboratory under Cooperative Agreement W911NF-17-2-0196 (ARL IoBT CRA). The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Army Research Laboratory or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation herein.

Attached Files

Published - voloshin21a.pdf

Submitted - 2103.02084.pdf

Supplemental Material - voloshin21a-supp.pdf


Files (2.9 MB)

Additional details

August 20, 2023
October 23, 2023