12/07/2020

Model Fusion with Kullback–Leibler Divergence

Sebastian Claici, Mikhail Yurochkin, Soumya Ghosh, Justin Solomon

Keywords: Probabilistic Inference - Approximate, Monte Carlo, and Spectral Methods

Abstract: We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors, and proceeds via a simple assign-and-average approach. The components of the dataset posteriors are assigned to the proposed global model components by solving a regularized variant of the assignment problem. The global components are then updated based on these assignments as their mean under a KL divergence. For exponential family variational distributions, our formulation leads to an efficient non-parametric algorithm for computing the fused model. Our algorithm is easy to describe and implement, efficient, and performs competitively with the state of the art when tested on motion capture analysis, topic modeling, and federated learning of Bayesian neural networks.
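To make the assign-and-average idea concrete, here is a minimal toy sketch (not the authors' implementation) for fusing mean-field posteriors whose components are 1-D Gaussians. It uses the Hungarian algorithm (`scipy.optimize.linear_sum_assignment`) as a stand-in for the paper's regularized assignment problem, and moment matching as the KL mean of the assigned Gaussian components; the function names `fuse` and `kl_gauss` are hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def kl_gauss(mu0, var0, mu1, var1):
    """KL divergence KL(N(mu0, var0) || N(mu1, var1)) for 1-D Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def fuse(local_posteriors, n_global, n_iters=10, seed=0):
    """Toy assign-and-average fusion of 1-D Gaussian mean-field components.

    local_posteriors: list of (mu, var) array pairs, one pair per dataset.
    Returns fused global (mu, var) arrays of length n_global.
    """
    rng = np.random.default_rng(seed)
    # Initialize the global components from a random subset of local ones.
    all_mu = np.concatenate([mu for mu, _ in local_posteriors])
    all_var = np.concatenate([var for _, var in local_posteriors])
    idx = rng.choice(len(all_mu), size=n_global, replace=False)
    g_mu, g_var = all_mu[idx].copy(), all_var[idx].copy()

    for _ in range(n_iters):
        assigned_mu = [[] for _ in range(n_global)]
        assigned_var = [[] for _ in range(n_global)]
        # Assignment step: match each dataset's components to the global
        # components by minimizing the total KL cost. (The paper solves a
        # regularized assignment problem; Hungarian matching stands in here.)
        for mu, var in local_posteriors:
            cost = np.array([[kl_gauss(m, v, gm, gv)
                              for gm, gv in zip(g_mu, g_var)]
                             for m, v in zip(mu, var)])
            rows, cols = linear_sum_assignment(cost)
            for r, c in zip(rows, cols):
                assigned_mu[c].append(mu[r])
                assigned_var[c].append(var[r])
        # Averaging step: each global component becomes the KL mean of its
        # assigned local components; for Gaussians this is moment matching.
        for j in range(n_global):
            if assigned_mu[j]:
                mus = np.array(assigned_mu[j])
                vs = np.array(assigned_var[j])
                g_mu[j] = mus.mean()
                g_var[j] = (vs + mus ** 2).mean() - g_mu[j] ** 2
    return g_mu, g_var
```

For example, `fuse([(np.array([0.0, 5.0]), np.ones(2)), (np.array([0.2, 4.8]), np.ones(2))], n_global=2)` should recover two global components near 0.1 and 4.9, illustrating how matched local components are pooled rather than naively averaged across arbitrary orderings.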
