Open Access System for Information Sharing


Thesis
Cited 0 times in Web of Science · Cited 0 times in Scopus
Full metadata record
Files in This Item:
There are no files associated with this item.
dc.contributor.author: 이규석
dc.date.accessioned: 2024-05-10T16:37:33Z
dc.date.available: 2024-05-10T16:37:33Z
dc.date.issued: 2024
dc.identifier.other: OAK-2015-10422
dc.identifier.uri: http://postech.dcollection.net/common/orgView/200000732626 (ko_KR)
dc.identifier.uri: https://oasis.postech.ac.kr/handle/2014.oak/123374
dc.description: Master
dc.description.abstract: Practical recommender systems (RS) must be both effective and efficient. Although knowledge distillation (KD) is a promising solution, repeatedly re-running it on continually arriving, non-stationary data is impractical due to high training costs. To solve this problem, we propose an integrated framework that combines KD and continual learning (CL) for real-world RS. This enables collaborative evolution between teacher and student models in a dynamic environment with continuous training on ever-changing datasets. Furthermore, we introduce stability and plasticity student models to prevent catastrophic forgetting, focusing on historical and recent knowledge, respectively. Both student and teacher models can therefore be trained with multi-faceted knowledge, improving their training over time. Our experiments demonstrate the superiority of the proposed framework, enhancing practicality in RS and bridging the industry-research gap in real-world RS applications.
dc.language: eng
dc.title: Collaborative Knowledge Distillation for Continual Learning in Recommender System
dc.type: Thesis
dc.contributor.college: 인공지능대학원 (Graduate School of Artificial Intelligence)
dc.date.degree: 2024-2
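The abstract outlines a setup in which a teacher and two student models (a "stability" student retaining historical knowledge and a "plasticity" student tracking recent data) distill knowledge into one another as training continues over successive data blocks. Since the thesis text itself is not attached to this record, the following is only a minimal, hypothetical sketch of such a bidirectional distillation loop; every class, loss choice, and helper here (`MFScorer`, `kd_loss`, `train_block`) is an assumption for illustration, not the author's actual method.

```python
# Hypothetical sketch of collaborative KD with stability/plasticity students
# for continual recommendation. All names and losses are illustrative
# assumptions, not the method described in the thesis.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MFScorer(nn.Module):
    """Tiny matrix-factorization scorer standing in for a real recommender."""
    def __init__(self, n_users, n_items, dim):
        super().__init__()
        self.user = nn.Embedding(n_users, dim)
        self.item = nn.Embedding(n_items, dim)

    def forward(self, u, i):
        return (self.user(u) * self.item(i)).sum(-1)

def kd_loss(learner_logits, target_logits, tau=2.0):
    """Hypothetical point-wise distillation: match softened interaction
    scores; gradients flow only into the first argument."""
    return F.mse_loss(learner_logits / tau, target_logits.detach() / tau)

n_users, n_items, dim = 1000, 500, 32
teacher = MFScorer(n_users, n_items, dim * 2)   # larger-capacity teacher
stability = MFScorer(n_users, n_items, dim)     # retains historical knowledge
plasticity = MFScorer(n_users, n_items, dim)    # adapts to recent data

opt = torch.optim.Adam(
    list(teacher.parameters())
    + list(stability.parameters())
    + list(plasticity.parameters()),
    lr=1e-3,
)

def train_block(recent_batches, replay_batches):
    """One continual-learning block: recent data drives the plasticity
    student, replayed historical data anchors the stability student, and
    the teacher is refreshed from both students' soft scores."""
    for (u_new, i_new, y_new), (u_old, i_old, y_old) in zip(
        recent_batches, replay_batches
    ):
        t_new, t_old = teacher(u_new, i_new), teacher(u_old, i_old)
        p_new = plasticity(u_new, i_new)
        s_old = stability(u_old, i_old)

        # Students: fit their own data slice and distill from the teacher.
        loss_p = F.binary_cross_entropy_with_logits(p_new, y_new) + kd_loss(p_new, t_new)
        loss_s = F.binary_cross_entropy_with_logits(s_old, y_old) + kd_loss(s_old, t_old)

        # Teacher: collaborative update from both students, so historical
        # and recent (multi-faceted) knowledge flows back into it.
        loss_t = (
            kd_loss(t_new, p_new)
            + kd_loss(t_old, s_old)
            + F.binary_cross_entropy_with_logits(t_new, y_new)
        )

        opt.zero_grad()
        (loss_p + loss_s + loss_t).backward()
        opt.step()

# Toy usage: random "recent" and "replayed historical" interaction batches.
def toy_batches(n):
    for _ in range(n):
        u = torch.randint(0, n_users, (64,))
        i = torch.randint(0, n_items, (64,))
        y = torch.randint(0, 2, (64,)).float()
        yield u, i, y

train_block(toy_batches(10), toy_batches(10))
```

The point the sketch tries to capture is the collaboration: each student learns from its own data slice plus the teacher's soft scores, while the teacher is in turn refreshed from both students, so recent adaptation and historical retention both feed back into future training rounds.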


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
