12/07/2020

Efficiently sampling functions from Gaussian process posteriors

James Wilson, Viacheslav Borovitskiy, Alexander Terenin, Peter Mostowsky, Marc Deisenroth

Keywords: Gaussian Processes

Abstract: Gaussian processes are the gold standard for many real-world modeling problems, especially in cases where a method's success hinges upon its ability to faithfully represent predictive uncertainty. These problems typically exist as parts of larger frameworks, where quantities of interest are ultimately defined by integrating over posterior distributions. However, these algorithms' inner workings rarely allow for closed-form integration, giving rise to a need for Monte Carlo methods. Despite substantial progress in scaling up Gaussian processes to large training sets, methods for accurately generating draws from their posterior distributions still scale cubically in the number of test locations. We identify a factorization of Gaussian processes that naturally lends itself to efficient sampling, by allowing accurate representation of entire function draws. Building off of this factorization, we propose decoupled sampling, an easy-to-use and general-purpose approach for fast posterior sampling. As a drop-in approach to sampling, decoupled sampling seamlessly pairs with sparse approximations to Gaussian processes to afford scalability both during training and at test time. In a series of experiments designed to test sampling schemes' statistical behavior and practical ramifications, we empirically show that functions drawn using decoupled sampling faithfully represent Gaussian process posteriors at a fraction of the cost.
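The abstract describes factorizing a posterior draw into a prior function draw plus a data-dependent update. As a hedged sketch of that idea (not the paper's implementation): below, the prior is approximated with random Fourier features for an RBF kernel, and a posterior function is obtained via the pathwise (Matheron-style) update f_post(x) = f_prior(x) + k(x, X)(K + σ²I)⁻¹(y − f_prior(X) − ε). All names, kernel choices, and sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# RBF kernel with unit lengthscale.
def rbf(a, b):
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2)

# Random Fourier features: phi(x) phi(x')^T approximates rbf(x, x').
n_feat, d = 500, 1
omega = rng.standard_normal((n_feat, d))       # spectral frequencies of the RBF kernel
tau = rng.uniform(0.0, 2 * np.pi, n_feat)      # random phases

def phi(x):
    return np.sqrt(2.0 / n_feat) * np.cos(x @ omega.T + tau)

# Toy regression data (hypothetical example, not from the paper).
X = rng.uniform(-3, 3, (10, d))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(10)
sigma2 = 0.01                                   # observation noise variance

# Step 1: draw an entire prior function f(x) = phi(x) @ w with w ~ N(0, I).
w = rng.standard_normal(n_feat)
f_prior = lambda x: phi(x) @ w

# Step 2: pathwise update conditioning the prior draw on the data.
eps = np.sqrt(sigma2) * rng.standard_normal(len(X))
K = rbf(X, X) + sigma2 * np.eye(len(X))
v = np.linalg.solve(K, y - f_prior(X) - eps)

def f_post(x):
    # One posterior function draw; evaluating it at new points avoids
    # the cubic cost in the number of test locations.
    return f_prior(x) + rbf(x, X) @ v

Xs = np.linspace(-3, 3, 100)[:, None]
sample = f_post(Xs)                             # a full function draw at 100 test points
```

After the one-time O(n³) setup on the n training points, each posterior sample is a deterministic function that can be evaluated at any number of test locations in linear time, which is the scalability property the abstract highlights.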

The talk and the respective paper are published at the ICML 2020 virtual conference.
