19/10/2020

Transformer models for recommending related questions in web search

Rajarshee Mitra, Manish Gupta, Sandipan Dandapat

Keywords: context embedding, transformers, related question, deep learning, snippets, bert, people also ask

Abstract: People Also Ask (PAA) is a feature in most leading search engines that recommends related questions for a given user query, thereby attempting to close the gap to the user's information need. This helps users dive deeper into the topic of interest and reduces task completion time. However, showing unrelated or irrelevant questions is highly detrimental to the user experience. While there has been significant work on query reformulation and related searches, there is hardly any published work on recommending related questions for a query. Question suggestion is challenging because a suggested question needs to be interesting, structurally correct, not a duplicate of other visible information, and reasonably related to the original query. In this paper, we present our system, which is based on a Transformer-based neural representation, BERT (Bidirectional Encoder Representations from Transformers), of the query, the question, and the corresponding search result snippets. Our best model provides an accuracy of 81%
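The abstract describes scoring candidate related questions against a query together with its search result snippets. The paper's actual model uses BERT representations; as a minimal sketch of the pipeline shape only, the stand-in below replaces the BERT encoder with toy bag-of-words vectors and cosine similarity (all function names here are illustrative assumptions, not the authors' code):

```python
from collections import Counter
import math

def embed(text):
    # Toy stand-in for a BERT sentence encoding: a bag-of-words
    # term-frequency vector. In the paper, a Transformer produces
    # dense contextual representations instead.
    return Counter(text.lower().split())

def cosine(u, v):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(u[t] * v[t] for t in u if t in v)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_related_questions(query, snippet, candidates):
    # Score each candidate question against the query plus a search
    # result snippet (the abstract notes snippets are part of the
    # input), returning candidates by descending relevance.
    context = embed(query + " " + snippet)
    return sorted(candidates, key=lambda q: cosine(context, embed(q)),
                  reverse=True)
```

For example, given the query "python read file" and a snippet mentioning `open()`, the candidate "how to open a file in python" would rank above an unrelated question, illustrating how irrelevant suggestions can be filtered by a relevance score.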

The video of this talk cannot be embedded. You can watch it here:
https://dl.acm.org/doi/10.1145/3340531.3412067#sec-supp
The talk and the corresponding paper were published at the CIKM 2020 virtual conference.
