14/09/2020

An algorithmic framework for decentralised matrix factorisation

Erika Duriakova, Weipeng Huang, Elias Tragos, Aonghus Lawlor, Barry Smyth, James Geraci, Neil Hurley

Keywords: recommender systems, distributed learning, decentralised matrix factorisation, latent factor models, matrix factorisation, communication efficiency, convergence proof

Abstract: We propose a framework for fully decentralised machine learning and apply it to latent factor models for top-N recommendation. In a decentralised learning setting, the training data is distributed across multiple agents, who jointly optimise a common global objective function (the loss function). In contrast to the client-server architecture of federated learning, the agents communicate directly, each maintaining and updating its own model parameters, without central aggregation and without sharing their own data. Our framework makes two key contributions. Firstly, we propose a method to extend a global loss function to a distributed loss function over the distributed parameters of the decentralised system; secondly, we show how this distributed loss function can be optimised by an algorithm that operates in two phases. In the learning phase, each agent carries out a large number of local learning steps without communication. In a subsequent sharing phase, neighbouring agents exchange messages that enable a batch update of their local parameters. Thus, unlike other decentralised algorithms that require inter-agent communication after every model update (or every few updates), our algorithm significantly reduces the number of messages exchanged during learning. We prove the convergence of our framework, demonstrate its effectiveness using both the Weighted Matrix Factorisation and Bayesian Personalised Ranking latent factor recommender models, and evaluate its performance empirically on a number of recommender system datasets.
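
The abstract describes the two-phase protocol only in prose. As a rough illustration, the following minimal Python sketch alternates a learning phase of many local SGD steps on a plain squared-error matrix factorisation loss with a sharing phase in which each agent exchanges only its item-factor copy with its neighbours and applies a batch (gossip-averaging) update. This is an assumption-laden sketch, not the authors' algorithm: their distributed loss construction and message scheme are defined in the paper, and all names, the ring topology, and the hyperparameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and hyperparameters (hypothetical, not from the paper).
N_AGENTS, USERS_PER_AGENT, N_ITEMS, K = 4, 20, 50, 8
LOCAL_STEPS, ROUNDS, LR, REG = 200, 10, 0.01, 0.1

# Communication topology: a ring, so each agent has two neighbours.
neighbours = {a: [(a - 1) % N_AGENTS, (a + 1) % N_AGENTS]
              for a in range(N_AGENTS)}

class Agent:
    def __init__(self, ratings):
        self.R = ratings                                    # private data, never shared
        self.P = rng.normal(0, 0.1, (ratings.shape[0], K))  # local user factors
        self.Q = rng.normal(0, 0.1, (N_ITEMS, K))           # local copy of item factors

    def learning_phase(self, steps):
        """Many local SGD steps on observed entries; no communication."""
        users, items = self.R.nonzero()
        if len(users) == 0:
            return
        for _ in range(steps):
            t = rng.integers(len(users))
            u, j = users[t], items[t]
            pu, qj = self.P[u].copy(), self.Q[j].copy()
            err = self.R[u, j] - pu @ qj
            self.P[u] += LR * (err * qj - REG * pu)
            self.Q[j] += LR * (err * pu - REG * qj)

# Synthetic private ratings per agent (sparse, values in [1, 5]).
def sparse_ratings():
    mask = rng.random((USERS_PER_AGENT, N_ITEMS)) < 0.2
    return mask * rng.uniform(1, 5, (USERS_PER_AGENT, N_ITEMS))

agents = [Agent(sparse_ratings()) for _ in range(N_AGENTS)]

for _ in range(ROUNDS):
    for ag in agents:                      # learning phase: zero messages
        ag.learning_phase(LOCAL_STEPS)
    # Sharing phase: each agent sends its item factors to its neighbours
    # and applies one batch update (here: plain gossip averaging).
    snapshots = [ag.Q.copy() for ag in agents]
    for a, ag in enumerate(agents):
        ag.Q = np.mean([snapshots[a]] + [snapshots[b] for b in neighbours[a]],
                       axis=0)
```

Note the message count: only one exchange per agent per round, after LOCAL_STEPS local updates, which is the communication saving the abstract highlights relative to schemes that synchronise after every update.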

Talk and paper published at ECML PKDD 2020 (virtual conference).
