Title
Knowledge Distillation Techniques for Accurate and Efficient Recommender Systems
Authors
강성구
Date Issued
2023
Publisher
Pohang University of Science and Technology (POSTECH)
Abstract
In this era of information explosion, recommender systems are widely used across industries to provide personalized user experiences, playing a key role in driving corporate profits. Recent recommender systems tend to adopt increasingly large and complex models to better capture the complex nature of user-item interactions. Large models with numerous parameters achieve high recommendation accuracy thanks to their expressive power, but they also incur correspondingly high computational costs and high inference latency, which has become a major obstacle to deployment. To reduce model size while maintaining accuracy, we focus on knowledge distillation (KD), a model-agnostic strategy that transfers knowledge from a well-trained large model to a compact model. A compact model trained with KD achieves accuracy comparable to that of the large model and, owing to its small size, performs online inference more efficiently. Despite its success in classification problems, KD for recommendation models and ranking problems has received little attention in the literature.

This dissertation is devoted to developing knowledge distillation methods for recommender systems that fully realize the performance of a compact model. We propose novel distillation methods designed for recommender systems, categorized by their knowledge source as follows.

(1) Latent knowledge: we propose two methods that transfer latent knowledge of user/item representations. They effectively transfer knowledge of niche tastes through a balanced distillation strategy that prevents the KD process from being biased toward a small number of large preference groups. We also propose a new method that transfers user/item relations in the representation space, selectively transferring essential relations in view of the compact model's limited capacity.

(2) Ranking knowledge: we propose three methods that transfer ranking knowledge from the recommendation results. They formulate the KD process as a ranking matching problem and transfer the knowledge via a listwise learning strategy. Further, we present a new learning framework that compresses the ranking knowledge of heterogeneous recommendation models, developed to ease the computational burden of model ensembles, a dominant solution in many recommendation applications.

We validate the benefits of the proposed methods and frameworks through extensive experiments. In summary, this dissertation sheds light on knowledge distillation approaches for a better accuracy-efficiency trade-off in recommendation models.
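To make the first knowledge source concrete, below is a minimal PyTorch sketch of representation-level (latent) distillation from a large teacher to a compact student, using plain matrix factorization as a stand-in recommender. All names here (MFRecommender, latent_distill_loss, the projection layer, and the dimensions) are illustrative assumptions, not the dissertation's actual methods; in particular, the balanced, preference-group-aware strategy described above is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MFRecommender(nn.Module):
    """Plain matrix-factorization scorer: score(u, i) = <p_u, q_i>."""
    def __init__(self, n_users, n_items, dim):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)

    def forward(self, users, items):
        # users: (B,), items: (B, K) -> scores: (B, K)
        p = self.user_emb(users).unsqueeze(1)   # (B, 1, d)
        q = self.item_emb(items)                # (B, K, d)
        return (p * q).sum(-1)

def latent_distill_loss(student, teacher, proj, users):
    """Pull the student's user representations toward the teacher's.

    `proj` is a learned map from the small student space into the larger
    teacher space, so the two representations can be compared directly.
    """
    with torch.no_grad():                       # teacher is frozen
        t_u = teacher.user_emb(users)           # (B, D_teacher)
    s_u = student.user_emb(users)               # (B, d_student)
    return F.mse_loss(proj(s_u), t_u)

# Example: a large (pretrained) teacher and a compact student to deploy.
teacher = MFRecommender(n_users=1000, n_items=5000, dim=128)
student = MFRecommender(n_users=1000, n_items=5000, dim=16)
proj = nn.Linear(16, 128)                       # student -> teacher space
users = torch.randint(0, 1000, (32,))
loss = latent_distill_loss(student, teacher, proj, users)
```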
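For the second knowledge source, one common way to formulate KD as a ranking matching problem is a listwise objective that aligns the student's distribution over each user's candidate list with the teacher's. Continuing the sketch above, the following is again a hedged illustration under assumed names; the dissertation's specific listwise methods may differ.

```python
def listwise_ranking_distill_loss(student, teacher, users, items, tau=1.0):
    """Listwise KD: match the student's softmax distribution over each
    user's candidate list to the teacher's via KL divergence, so the
    student learns the teacher's ranking rather than its raw scores."""
    with torch.no_grad():
        t_scores = teacher(users, items)        # (B, K) teacher scores
    s_scores = student(users, items)            # (B, K) student scores
    t_probs = F.softmax(t_scores / tau, dim=-1)
    s_logp = F.log_softmax(s_scores / tau, dim=-1)
    return F.kl_div(s_logp, t_probs, reduction="batchmean")

# Candidate items per user, e.g. the teacher's top-K recommendations.
items = torch.randint(0, 5000, (32, 50))
loss = listwise_ranking_distill_loss(student, teacher, users, items)
```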
URI
http://postech.dcollection.net/common/orgView/200000690173
https://oasis.postech.ac.kr/handle/2014.oak/118395
Article Type
Thesis
Files in This Item:
There are no files associated with this item.
