
CoverNet: Multimodal Behavior Prediction Using Trajectory Sets

Phan-Minh, Tung and Grigore, Elena Corina and Boulton, Freddy A. and Beijbom, Oscar and Wolff, Eric M. (2020) CoverNet: Multimodal Behavior Prediction Using Trajectory Sets. In: 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE , Piscataway, NJ, pp. 14062-14071. ISBN 9781728171685.

Full text is not posted in this repository. Consult Related URLs below.

We present CoverNet, a new method for multimodal, probabilistic trajectory prediction for urban driving. Previous work has employed a variety of methods, including multimodal regression, occupancy maps, and 1-step stochastic policies. We instead frame the trajectory prediction problem as classification over a diverse set of trajectories. The size of this set remains manageable due to the limited number of distinct actions that can be taken over a reasonable prediction horizon. We structure the trajectory set to a) ensure a desired level of coverage of the state space, and b) eliminate physically impossible trajectories. By dynamically generating trajectory sets based on the agent's current state, we can further improve our method's efficiency. We demonstrate our approach on public, real-world self-driving datasets, and show that it outperforms state-of-the-art methods.
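The abstract's core idea, framing prediction as classification over a fixed set of candidate trajectories, can be sketched as follows. This is a minimal illustrative assumption of the approach, not the paper's implementation: candidates are constant speed / yaw-rate rollouts, physically implausible ones are filtered by a lateral-acceleration bound, and the training label is the candidate closest to the ground-truth future. All names and parameters here are hypothetical.

```python
import numpy as np

def build_trajectory_set(speeds, yaw_rates, horizon=6.0, dt=0.5):
    """Roll out constant speed / yaw-rate motions into a candidate set.

    Candidates whose lateral acceleration (a_lat = v * yaw_rate) exceeds an
    assumed friction/comfort bound are dropped, mimicking the paper's
    elimination of physically impossible trajectories.
    """
    steps = int(horizon / dt)
    max_lat_accel = 4.0  # m/s^2 -- assumed bound, not from the paper
    candidates = []
    for v in speeds:
        for w in yaw_rates:
            if abs(v * w) > max_lat_accel:
                continue  # physically implausible: skip this candidate
            x = y = heading = 0.0
            xy = []
            for _ in range(steps):
                heading += w * dt
                x += v * np.cos(heading) * dt
                y += v * np.sin(heading) * dt
                xy.append((x, y))
            candidates.append(xy)
    return np.array(candidates)  # shape (K, steps, 2)

def closest_mode(trajectory_set, ground_truth):
    """Classification target: index of the candidate nearest the ground truth
    (mean pointwise L2 distance), as in a classification-over-modes framing."""
    dists = np.linalg.norm(trajectory_set - ground_truth[None], axis=-1).mean(-1)
    return int(np.argmin(dists))

traj_set = build_trajectory_set(speeds=[2.0, 5.0, 8.0],
                                yaw_rates=[-0.3, 0.0, 0.3])
# A noisy copy of candidate 0 stands in for an observed future trajectory.
gt = traj_set[0] + np.random.default_rng(0).normal(0.0, 0.1, traj_set[0].shape)
label = closest_mode(traj_set, gt)
```

A network would then output a softmax over the K candidates and be trained with cross-entropy against `label`; because the set is finite, "coverage" of the state space is controlled directly by how densely the candidates are sampled.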

Item Type: Book Section
Related URLs:
Additional Information: © 2020 IEEE. Work done during an internship at nuTonomy, an Aptiv company. We would like to thank Emilio Frazzoli and Sourabh Vora for insightful discussions, and Robert Beaudoin for help on the implementation.
Record Number: CaltechAUTHORS:20200806-153947718
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 104784
Deposited By: George Porter
Deposited On: 10 Aug 2020 16:42
Last Modified: 16 Nov 2021 18:35
