12/07/2020

Can autonomous vehicles identify, recover from, and adapt to distribution shifts?

Angelos Filos, Panagiotis Tigas, Rowan McAllister, Nicholas Rhinehart, Sergey Levine, Yarin Gal

Keywords: Trustworthy Machine Learning

Abstract: Out-of-distribution (OOD) driving scenarios are a common failure mode of learning agents at deployment, typically leading to arbitrary deductions and poorly-informed decisions. In principle, detection of and adaptation to OOD scenes can mitigate their adverse effects. However, no benchmark currently exists for comparing OOD detection and adaptation methods. In this paper, we introduce an autonomous-car novel-scene benchmark, CARNOVEL, to evaluate the robustness of driving agents on a suite of tasks involving distribution shift. We also highlight the limitations of current approaches to novel driving scenes and propose an epistemic uncertainty-aware planning method, called robust imitative planning (RIP). Our method can detect and recover from some distribution shifts, reducing the overconfident and catastrophic extrapolations in out-of-training-distribution scenes. When the model's uncertainty quantification is by itself insufficient to suggest a safe course of action, it is instead used to query the driver for feedback, enabling sample-efficient online adaptation, a variant of our method we term adaptive robust imitative planning (AdaRIP).
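To make the abstract's mechanism concrete, below is a minimal sketch of epistemic-uncertainty-aware plan selection with deferral, assuming RIP is realized as an ensemble of learned trajectory models scored by log-likelihood. The function name, the min-over-models aggregation, and the disagreement threshold are illustrative assumptions for this sketch, not details taken from this page.

```python
import numpy as np

def rip_select_plan(candidate_plans, ensemble_log_probs, uncertainty_threshold):
    """Pick the plan with the best worst-case score under an ensemble of
    trajectory models; flag high-disagreement scenes for driver feedback.

    candidate_plans: list of candidate trajectories (any representation).
    ensemble_log_probs: array of shape (num_models, num_plans), where entry
        [k, i] is model k's log-likelihood of plan i in the current scene.
    uncertainty_threshold: disagreement level above which we defer
        (an assumed AdaRIP-style trigger).
    """
    scores = np.asarray(ensemble_log_probs)   # (K models, N plans)
    worst_case = scores.min(axis=0)           # robust: min over ensemble members
    best = int(worst_case.argmax())           # plan least penalized in the worst case

    # Epistemic-uncertainty proxy: ensemble disagreement on the chosen plan.
    disagreement = float(scores[:, best].std())
    if disagreement > uncertainty_threshold:
        return None, disagreement             # defer: query the driver for feedback
    return candidate_plans[best], disagreement
```

Under this reading, a low worst-case likelihood across the ensemble signals an OOD scene, and high disagreement triggers the AdaRIP-style query to the driver; the min is only one plausible robust aggregation, and the page itself does not specify the operator.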

Embedded video: the talk and the respective paper were published at the ICML 2020 virtual conference.
