DC Field | Value | Language |
---|---|---|
dc.contributor.author | 이규석 | - |
dc.date.accessioned | 2024-05-10T16:37:33Z | - |
dc.date.available | 2024-05-10T16:37:33Z | - |
dc.date.issued | 2024 | - |
dc.identifier.other | OAK-2015-10422 | - |
dc.identifier.uri | http://postech.dcollection.net/common/orgView/200000732626 | ko_KR |
dc.identifier.uri | https://oasis.postech.ac.kr/handle/2014.oak/123374 | - |
dc.description | Master | - |
dc.description.abstract | Practical recommender systems (RS) necessitate both effectiveness and efficiency. Although knowledge distillation (KD) is a promising solution, its frequent execution on continually incoming non-stationary datasets is impractical due to high training costs. To solve this problem, we propose an integrated framework that combines KD and continual learning (CL) for real-world RS. This enables collaborative evolution between teacher and student models in a dynamic environment with continuous training on ever-changing datasets. Furthermore, we introduce stability and plasticity student models to prevent catastrophic forgetting, focusing on historical and recent knowledge, respectively. Therefore, both student and teacher models can be trained with multi-faceted knowledge, enhancing their training over time. Our experiments demonstrate the superiority of the proposed framework, enhancing practicality in RS and bridging the industry-research gap in real-world RS applications. | - |
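The abstract describes a teacher distilling into two complementary students: a "stability" student anchored to historical data and a "plasticity" student tracking recent data. The sketch below is a minimal, hypothetical illustration of that loss structure only; all function names, signatures, and the blending weights (`alpha`, `beta`) are illustrative assumptions, not the thesis's actual implementation.

```python
# Hedged sketch of the collaborative KD + CL objective implied by the
# abstract: each student blends a hard-label loss with a soft
# teacher-matching loss, and the framework balances the stability
# (historical) and plasticity (recent) objectives.

def mse(p, q):
    """Mean squared error between two equal-length score lists."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) / len(p)

def distillation_loss(teacher_scores, student_scores, true_scores, alpha=0.5):
    """Blend a hard-label loss with a soft teacher-matching (KD) loss."""
    return (alpha * mse(student_scores, true_scores)
            + (1 - alpha) * mse(student_scores, teacher_scores))

def collaborative_loss(teacher, stability, plasticity,
                       historical, recent, beta=0.5):
    """Stability student is scored on historical labels, plasticity on
    recent ones; beta trades off the two objectives."""
    l_stab = distillation_loss(teacher["hist"], stability, historical)
    l_plas = distillation_loss(teacher["recent"], plasticity, recent)
    return beta * l_stab + (1 - beta) * l_plas
```

In this toy form, a student that matches both its labels and the teacher incurs zero loss, while disagreement with either raises it; the real framework would apply such objectives over model parameters during continual training.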
dc.language | eng | - |
dc.title | Collaborative Knowledge Distillation for Continual Learning in Recommender System | - |
dc.type | Thesis | - |
dc.contributor.college | 인공지능대학원 (Graduate School of Artificial Intelligence) | - |
dc.date.degree | 2024-2 | - |