25/07/2020

Deep interest with hierarchical attention network for click-through rate prediction

Weinan Xu, Hengxu He, Minshi Tan, Yunming Li, Jun Lang, Dongbai Guo

Keywords: hierarchical pattern, recommendation, click-through rate prediction, hierarchical attention network

Abstract: Deep Interest Network (DIN) is a state-of-the-art model which uses an attention mechanism to capture user interests from historical behaviors. User interests intuitively follow a hierarchical pattern such that users generally show interest first at a higher level and then at a lower level of abstraction. Modelling such an interest hierarchy in an attention network can fundamentally improve the representation of user behaviors. We therefore propose an improvement over DIN to model arbitrary interest hierarchy: Deep Interest with Hierarchical Attention Network (DHAN). In this model, a multi-dimensional hierarchical structure is introduced: the first attention layer attends to individual items, and the subsequent attention layers in the same dimension attend to higher-level hierarchies built on top of the corresponding lower layers. To enable modelling of a multi-dimensional hierarchy, an expanding mechanism is introduced to capture one-to-many hierarchies. This design enables DHAN to assign different importance to different hierarchical abstractions and thus fully capture a user's interests along different dimensions (e.g. category, price or brand). To validate our model, a simplified DHAN is applied to Click-Through Rate (CTR) prediction and evaluated on three public datasets, using a two-level, one-dimensional hierarchy by category only. The results show DHAN's superiority, with significant AUC uplift from 12% to 21% against DIN.
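The two-level attention described in the abstract can be sketched minimally: a first attention layer pools the behaviour items within each category, and a second layer pools the resulting category summaries into a single user-interest vector. This is an illustrative sketch only; the function names, dot-product scoring, and NumPy implementation are assumptions, not the paper's actual architecture.

```python
import numpy as np

def attention_pool(query, items):
    """Weighted pooling of item vectors via dot-product attention with the query."""
    scores = items @ query                  # one score per item, shape (n,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                # softmax over items
    return weights @ items                  # pooled vector, shape (d,)

def two_level_hierarchical_attention(query, item_embs, category_ids):
    """Illustrative two-level hierarchy: items -> categories -> user vector.

    Level 1 attends over the items inside each category; level 2 attends
    over the resulting category summaries. All names are hypothetical.
    """
    cat_vectors = []
    for cat in sorted(set(category_ids)):
        member_idx = [i for i, c in enumerate(category_ids) if c == cat]
        cat_vectors.append(attention_pool(query, item_embs[member_idx]))
    return attention_pool(query, np.stack(cat_vectors))

# Example: 4 behaviour items in 2 categories, embedding dimension 3
rng = np.random.default_rng(0)
query = rng.normal(size=3)            # candidate-ad embedding
items = rng.normal(size=(4, 3))       # historical behaviour embeddings
user_vec = two_level_hierarchical_attention(query, items, [0, 0, 1, 1])
print(user_vec.shape)                 # (3,)
```

The point of the second attention layer is that a whole category can be up- or down-weighted relative to other categories, independently of how its individual items scored, which is the hierarchical effect a flat item-level attention (as in DIN) cannot express.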

The video of this talk cannot be embedded. You can watch it here:
https://dl.acm.org/doi/10.1145/3397271.3401310#sec-supp
(Link will open in new window)
Talk and the respective paper are published at the SIGIR 2020 virtual conference.
