Attention Over Self-Attention: Intention-Aware Re-Ranking With Dynamic Transformer Encoders for Recommendation

Title:
Attention Over Self-Attention: Intention-Aware Re-Ranking With Dynamic Transformer Encoders for Recommendation
Journal Title:
IEEE Transactions on Knowledge and Data Engineering
Publication Date:
30 September 2022
Citation:
Lin, Z., Zang, S., Wang, R., Sun, Z., Senthilnath, J., Xu, C., & Kwoh, C. K. (2022). Attention Over Self-Attention: Intention-Aware Re-Ranking With Dynamic Transformer Encoders for Recommendation. IEEE Transactions on Knowledge and Data Engineering, 1–12. https://doi.org/10.1109/tkde.2022.3208633
Abstract:
Re-ranking models refine the item recommendation lists generated by a prior global ranking model and have demonstrated their effectiveness in improving recommendation quality. However, most existing re-ranking solutions learn only from implicit feedback with a shared prediction model, regrettably ignoring inter-item relationships under diverse user intentions. In this paper, we propose a novel Intention-aware Re-ranking Model with Dynamic Transformer Encoder (RAISE), which performs user-specific prediction for each individual user based on her intentions. Specifically, we first mine latent user intentions from text reviews with an intention discovering module (IDM). By differentiating the importance of review information with a co-attention network, the latent user intention can be explicitly modeled for each user-item pair. We then introduce a dynamic transformer encoder (DTE) to capture user-specific inter-item relationships among item candidates by seamlessly accommodating the latent user intentions learned via IDM. As such, one can not only achieve more personalized recommendations but also obtain the corresponding explanations by building RAISE upon existing recommendation engines. An empirical study on four public datasets shows the superiority of our proposed RAISE, with relative improvements of up to 13.95%, 9.60%, and 13.03% in Precision@5, MAP@5, and NDCG@5, respectively.
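
For readers who want a concrete picture of the idea the abstract describes, below is a minimal, illustrative PyTorch sketch: a per-user latent intention vector (a stand-in for IDM output) conditions the self-attention of a transformer-style encoder over the candidate list, yielding user-specific inter-item mixing before re-ranking scores are produced. Everything here is an assumption made for illustration only, including the module names (DynamicSelfAttention, RaiseLikeReRanker), the FiLM-style scale-and-shift conditioning, and all dimensions; the paper's actual DTE architecture may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicSelfAttention(nn.Module):
    """Self-attention whose query/key projections are modulated by a
    per-user intention vector (FiLM-style conditioning; an assumed
    mechanism, not necessarily the paper's exact one)."""
    def __init__(self, d_model: int, d_intent: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # Map the intention vector to per-dimension scale/shift for Q and K.
        self.film = nn.Linear(d_intent, 4 * d_model)

    def forward(self, items: torch.Tensor, intent: torch.Tensor) -> torch.Tensor:
        # items:  (batch, n_candidates, d_model) item embeddings
        # intent: (batch, d_intent) latent user intention
        gq, bq, gk, bk = self.film(intent).chunk(4, dim=-1)
        q = self.q(items) * (1 + gq.unsqueeze(1)) + bq.unsqueeze(1)
        k = self.k(items) * (1 + gk.unsqueeze(1)) + bk.unsqueeze(1)
        v = self.v(items)
        scores = q @ k.transpose(-2, -1) / items.size(-1) ** 0.5
        # User-specific inter-item relationships among candidates.
        return F.softmax(scores, dim=-1) @ v

class RaiseLikeReRanker(nn.Module):
    """Toy re-ranker: scores an initial candidate list conditioned on a
    latent intention vector (assumed to come from a review-based module
    such as the paper's IDM)."""
    def __init__(self, d_model: int = 64, d_intent: int = 32):
        super().__init__()
        self.attn = DynamicSelfAttention(d_model, d_intent)
        self.score = nn.Linear(d_model, 1)

    def forward(self, items: torch.Tensor, intent: torch.Tensor) -> torch.Tensor:
        h = self.attn(items, intent)
        return self.score(h).squeeze(-1)  # (batch, n_candidates) scores

# Usage: re-rank 10 candidates for a batch of 2 users.
model = RaiseLikeReRanker()
items = torch.randn(2, 10, 64)   # candidate embeddings from a base ranker
intent = torch.randn(2, 32)      # latent user intentions (IDM stand-in)
reranked = model(items, intent).argsort(dim=-1, descending=True)
print(reranked.shape)            # torch.Size([2, 10])
```

The conditioning choice here reflects the abstract's claim that the encoder is "dynamic": the same candidate list yields different attention patterns, and hence a different re-ranked order, for users with different intentions.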
License type:
Publisher Copyright
Funding Info:
This research / project is supported by the Ministry of Education - Academic Research Fund Tier 1
Grant Reference no. : 2020-T1-001-130 (RG15/20)

This research / project is supported by the Ministry of Education - Academic Research Fund Tier 2
Grant Reference no. : MOE2019-T2-2-175
Description:
© 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISSN:
1041-4347
1558-2191
2326-3865
Files uploaded:
tkde-raise-final-1.pdf (PDF, 5.15 MB)