12/07/2020

Non-separable Non-stationary Random Fields

Kangrui Wang, Oliver Hamelijnck, Theodoros Damoulas, Mark Steel

Keywords: General Machine Learning Techniques

Abstract: We describe a framework for constructing non-separable non-stationary random fields that is based on an infinite mixture of convolved stochastic processes. When the mixing process is stationary and the convolution function is non-stationary we arrive at expressive kernels that are available in closed form. When the mixing is non-stationary and the convolution function is stationary the resulting random fields exhibit varying degrees of non-separability that better preserve local structure. These kernels have natural interpretations through corresponding stochastic differential equations (SDEs) and are demonstrated on a range of synthetic benchmarks and spatio-temporal applications in geostatistics and machine learning. We show how a single Gaussian process (GP) with these kernels can computationally and statistically outperform both separable and existing non-stationary non-separable approaches such as treed GPs and deep GP constructions.
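To make the notion of a non-stationary kernel concrete, below is a minimal sketch of the classic Gibbs (1997) kernel, a well-known closed-form non-stationary covariance in which the lengthscale varies with input location. This is a generic illustration of input-dependent kernels, not the paper's mixture-of-convolutions construction; the lengthscale function used is a hypothetical choice.

```python
import numpy as np

def gibbs_kernel(x1, x2, lengthscale):
    """Gibbs non-stationary kernel in 1-D.

    k(x, x') = sqrt(2 l(x) l(x') / (l(x)^2 + l(x')^2))
               * exp(-(x - x')^2 / (l(x)^2 + l(x')^2))

    `lengthscale` is a callable giving l(x) > 0 at each input.
    Shown only to illustrate non-stationarity; it is NOT the
    kernel proposed in the paper above.
    """
    l1 = lengthscale(x1)[:, None]          # l(x) at row inputs
    l2 = lengthscale(x2)[None, :]          # l(x') at column inputs
    denom = l1**2 + l2**2
    prefac = np.sqrt(2.0 * l1 * l2 / denom)
    sqdist = (x1[:, None] - x2[None, :])**2
    return prefac * np.exp(-sqdist / denom)

# Hypothetical lengthscale that grows away from the origin,
# so correlations decay faster near x = 0 than far from it.
ls = lambda x: 0.5 + 0.2 * np.abs(x)
x = np.linspace(-3.0, 3.0, 50)
K = gibbs_kernel(x, x, ls)
```

Because the prefactor equals 1 when x = x', the diagonal of K is exactly 1, and the matrix remains symmetric positive semi-definite for any positive lengthscale function.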

The talk and the respective paper are published at the ICML 2020 virtual conference.
