Item-side ranking regularized distillation for recommender system
SCIE
SCOPUS
- Title
- Item-side ranking regularized distillation for recommender system
- Authors
- SeongKu Kang; Junyoung Hwang; Wonbin Kweon; Hwanjo Yu
- Date Issued
- 2021-11
- Publisher
- Elsevier BV
- Abstract
- Recent recommender systems (RS) have adopted large and sophisticated model architectures to better understand the complex user-item relationships, and accordingly, the size of the recommender is continuously increasing. To reduce the high inference costs of the large recommender, knowledge distillation (KD), a model compression technique that transfers knowledge from a large pre-trained model (teacher) to a small model (student), has been actively studied for RS. The state-of-the-art method is based on the ranking distillation approach, which makes the student preserve the ranking orders among items predicted by the teacher. In this work, we propose a new regularization method designed to maximize the effect of ranking distillation in RS. We first point out an important limitation of, and room for improvement in, the state-of-the-art ranking distillation method based on our in-depth analysis. Then, we introduce the item-side ranking regularization, which effectively prevents the student with limited capacity from overfitting and enables the student to learn the teacher’s prediction results more accurately. We validate the superiority of the proposed method through extensive experiments on real-world datasets.
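The ranking distillation approach described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual loss; it is a hypothetical simplification in which the student is trained to assign high scores to the teacher's top-K ranked items (the function name, weighting scheme, and score shapes are all assumptions for illustration):

```python
import numpy as np

def ranking_distillation_loss(student_scores, teacher_topk, weights=None):
    """Hypothetical ranking-distillation loss sketch.

    Encourages the student to score the teacher's top-K items highly
    by summing negative log-sigmoids of the student's scores on those
    items. `weights` can down-weight lower-ranked teacher items.
    """
    if weights is None:
        weights = np.ones(len(teacher_topk))
    s = student_scores[teacher_topk]          # student scores on teacher's top-K items
    log_sigmoid = -np.log1p(np.exp(-s))       # numerically stable log(sigmoid(s))
    return float(-np.sum(weights * log_sigmoid))

# Usage: a student that already ranks the teacher's top items highly
# incurs a lower loss than one that ranks them low.
student_good = np.array([2.0, 1.0, 0.0, -1.0])
student_bad = np.array([-1.0, 0.0, 1.0, 2.0])
teacher_topk = [0, 1]  # items the teacher ranks highest
loss_good = ranking_distillation_loss(student_good, teacher_topk)
loss_bad = ranking_distillation_loss(student_bad, teacher_topk)
```

The item-side regularization proposed in the paper would add a further term constraining, for each item, which users the student ranks highly; that component is not sketched here.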
- URI
- https://oasis.postech.ac.kr/handle/2014.oak/107243
- DOI
- 10.1016/j.ins.2021.08.060
- ISSN
- 0020-0255
- Article Type
- Article
- Citation
- Information Sciences, vol. 580, pp. 15–34, 2021-11