Abstract:
We formulate expectation propagation (EP), a state-of-the-art method for approximate Bayesian inference, as a nonlinear Kalman smoother, showing that it generalises a wide class of classical smoothing algorithms. Specifically, we show how power EP recovers the Extended and Unscented Kalman smoothers, with the distinction between the two being the choice of method for performing moment matching. EP provides benefits over these traditional methods via the introduction of the so-called cavity distribution, and by allowing fractional updates. We combine these benefits with the computational efficiency of Kalman smoothing, and provide extensive empirical analysis demonstrating the efficacy of various algorithms under this unifying framework. The resulting schemes enable inference in Gaussian process models with time complexity that is linear in the number of data points, making them ideal for large temporal and spatio-temporal scenarios. Our results show that an extension of the Extended Kalman filter in which the linearisations are iteratively refined via EP-style updates is both efficient and performant, whilst its ease of implementation makes it a convenient plug-and-play approach to many non-conjugate regression and classification problems.
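As a concrete illustration of the moment-matching distinction highlighted above, the following is a minimal numerical sketch (not taken from the paper's code) for a single one-dimensional measurement y = h(f) + noise with a hypothetical nonlinearity h(f) = sin(f). The "extended" update matches the moments of h via a first-order Taylor linearisation at the mean, whereas the "unscented" update matches them via sigma points; both then apply a Gaussian update of the same form, differing only in how the moments are computed.

```python
# Hypothetical toy example (not from the paper): one nonlinear measurement
# y = h(f) + noise with h(f) = sin(f). The two updates below differ only in
# how the moments of h are matched: analytical (Taylor) linearisation versus
# sigma-point (statistical) linearisation.
import numpy as np

def h(f):
    return np.sin(f)

def extended_update(m, P, y, R):
    """EKF-style update: first-order Taylor linearisation of h at the mean m."""
    H = np.cos(m)                 # dh/df evaluated at m
    y_hat = h(m)
    S = H * P * H + R             # innovation variance
    K = P * H / S                 # Kalman gain
    return m + K * (y - y_hat), (1.0 - K * H) * P

def unscented_update(m, P, y, R, kappa=2.0):
    """UKF-style update: moments of h matched numerically via sigma points."""
    n = 1
    spread = np.sqrt((n + kappa) * P)
    X = np.array([m, m + spread, m - spread])      # sigma points
    W = np.array([kappa, 0.5, 0.5]) / (n + kappa)  # sigma-point weights
    Y = h(X)
    y_hat = W @ Y                                  # predicted measurement mean
    S = W @ (Y - y_hat) ** 2 + R                   # innovation variance
    C = W @ ((X - m) * (Y - y_hat))                # state-measurement cross-cov.
    K = C / S                                      # Kalman gain
    return m + K * (y - y_hat), P - K * S * K

# Illustrative values (assumed): prior/cavity moments, observation, noise variance.
m, P, y, R = 0.8, 0.5, 0.3, 0.1
print(extended_update(m, P, y, R))
print(unscented_update(m, P, y, R))
```

In this sketch the Gaussian (m, P) plays the role of the cavity-like input to the update; in the framework described above, iterating such updates with EP-style refinement of the linearisation point is what yields the family of smoothing algorithms discussed.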