Attention-based skill translation models for expert finding

Fallahnejad, Z.; Sharif University of Technology | 2022

  Type of Document: Article
  DOI: 10.1016/j.eswa.2021.116433
  Publisher: Elsevier Ltd, 2022
  Abstract: The growing popularity of community question answering websites can be seen in their growing user bases. Many methods have been proposed to identify talented users in these communities, but most suffer from vocabulary mismatch; translation approaches offer a solution to this problem. The present paper proposes two translation methods for extracting more relevant translations. Both rely on the attention mechanism: multi-label classifiers take each question as input and predict the skills related to it. Using attention, the model can focus on specific parts of the given input and predict the correct labels. The ultimate goal of these networks is to predict the skills related to questions. Word attention scores reveal how relevant a single word is to a particular skill, and from these scores we obtain more relevant translations for each skill. We then use these translations to bridge the lexical gap and improve expert retrieval results. Extensive experiments on two large sub-collections of the StackOverflow dataset demonstrate that the proposed methods outperform the best baseline method by up to 14.11% MAP improvement. © 2022 Elsevier Ltd
  Keywords: Translation models; Classification (of information); Large dataset; Semantics; Attention mechanisms; Baseline methods; Community question answering; Expert finding; Semantic matching; Single words; StackOverflow; Sub-collections; Translation method; Forecasting
  Source: Expert Systems with Applications; Volume 193, 2022; ISSN 0957-4174
  URL: https://www.sciencedirect.com/science/article/abs/pii/S0957417421017206
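The abstract's core idea — reading a classifier's per-word attention scores as a skill-to-word translation table — can be illustrated with a minimal sketch. Everything below is hypothetical toy data (the vocabulary, skill labels, and score matrix are invented for illustration, not taken from the paper): given attention weights over a question's words for each predicted skill, the most-attended words become that skill's translations.

```python
import numpy as np

# Hypothetical toy vocabulary and skill labels (not from the paper).
vocab = ["dataframe", "groupby", "tensor", "gradient", "merge"]
skills = ["pandas", "pytorch"]

# Assumed raw attention logits: entry [s, w] is how strongly the
# classifier attends to word w when predicting skill s.
logits = np.array([
    [2.0, 1.5, -1.0, -2.0, 1.0],   # attention for skill "pandas"
    [-1.5, -1.0, 2.5, 2.0, -0.5],  # attention for skill "pytorch"
])

# Softmax over the vocabulary so each skill's scores form a distribution.
attn = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

def top_translations(skill_idx, k=2):
    """Return the k words the classifier attends to most for a skill,
    i.e. that skill's most relevant 'translations'."""
    order = np.argsort(attn[skill_idx])[::-1][:k]
    return [vocab[i] for i in order]

for s, name in enumerate(skills):
    print(name, "->", top_translations(s))
```

In the paper's retrieval pipeline, such per-skill word lists would then be used to expand or match queries and bridge the lexical gap; the sketch only shows the score-to-translation step.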