Collaborative Knowledge Distillation for Continual Learning in Recommender System

Authors
이규석
Date Issued
2024
Abstract
Practical recommender systems (RS) require both effectiveness and efficiency. Although knowledge distillation (KD) is a promising solution, executing it repeatedly on continually incoming, non-stationary datasets is impractical due to high training costs. To address this problem, we propose an integrated framework that combines KD and continual learning (CL) for real-world RS, enabling collaborative evolution between teacher and student models in a dynamic environment with continuous training on ever-changing datasets. Furthermore, we introduce stability and plasticity student models, which focus on historical and recent knowledge respectively, to prevent catastrophic forgetting. Both student and teacher models can thus be trained with multi-faceted knowledge, improving over time. Our experiments demonstrate the superiority of the proposed framework, enhancing the practicality of RS and bridging the industry-research gap in real-world RS applications.
URI
http://postech.dcollection.net/common/orgView/200000732626
https://oasis.postech.ac.kr/handle/2014.oak/123374
Article Type
Thesis
Files in This Item:
There are no files associated with this item.
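The abstract above describes an architecture of one teacher and two students (stability and plasticity) distilled collaboratively across data increments. Below is a minimal sketch of that setup, assuming a PyTorch matrix-factorization backbone; the distillation loss (MSE on scores), the loss weight alpha, the model sizes, and the per-student learning rates are illustrative assumptions, not the thesis's actual design.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MF(nn.Module):
    """Matrix-factorization scorer, used here as both teacher and student backbone."""
    def __init__(self, n_users, n_items, dim):
        super().__init__()
        self.user = nn.Embedding(n_users, dim)
        self.item = nn.Embedding(n_items, dim)

    def forward(self, u, i):
        # Dot-product score for each (user, item) pair in the batch.
        return (self.user(u) * self.item(i)).sum(-1)

def distill_loss(teacher, student, u, i, y, alpha=0.5):
    """Fit observed feedback while matching the (frozen) teacher's scores."""
    with torch.no_grad():
        t_score = teacher(u, i)
    s_score = student(u, i)
    return (F.binary_cross_entropy_with_logits(s_score, y)
            + alpha * F.mse_loss(s_score, t_score))

# Toy setup: a large teacher and two small students for one data increment.
n_users, n_items, dim = 100, 200, 16
teacher    = MF(n_users, n_items, 4 * dim)
stability  = MF(n_users, n_items, dim)   # preserves historical knowledge
plasticity = MF(n_users, n_items, dim)   # adapts quickly to recent data

# A smaller learning rate for the stability student slows forgetting;
# a larger one lets the plasticity student track the newest interactions.
opt_s = torch.optim.Adam(stability.parameters(), lr=1e-4)
opt_p = torch.optim.Adam(plasticity.parameters(), lr=1e-2)

# One batch of (user, item, implicit-feedback label) from the current increment.
u = torch.randint(0, n_users, (64,))
i = torch.randint(0, n_items, (64,))
y = torch.randint(0, 2, (64,)).float()

for student, opt in ((stability, opt_s), (plasticity, opt_p)):
    opt.zero_grad()
    loss = distill_loss(teacher, student, u, i, y)
    loss.backward()
    opt.step()

In a full continual-learning loop, the teacher would in turn be refreshed from the multi-faceted knowledge of both students before the next increment arrives; that reverse step is omitted here for brevity.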


