Collaborative distillation for top-N recommendation

Jae Woong Lee, Minjin Choi, Jongwuk Lee, Hyunjung Shim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

25 Citations (Scopus)

Abstract

Knowledge distillation (KD) is a well-known method to reduce inference latency by compressing a cumbersome teacher model into a small student model. Despite the success of KD in the classification task, applying KD to recommender models is challenging due to the sparsity of positive feedback, the ambiguity of missing feedback, and the ranking problem associated with the top-N recommendation. To address these issues, we propose a new KD model for the collaborative filtering approach, namely collaborative distillation (CD). Specifically, (1) we reformulate a loss function to deal with the ambiguity of missing feedback. (2) We exploit probabilistic rank-aware sampling for the top-N recommendation. (3) To train the proposed model effectively, we develop two training strategies for the student model, called the teacher-guided and the student-guided training methods, which select the most useful feedback from the teacher model. Via experimental results, we demonstrate that the proposed model outperforms the state-of-the-art method by 5.5-29.7% and 4.8-27.8% in hit rate (HR) and normalized discounted cumulative gain (NDCG), respectively. Moreover, the proposed model achieves performance comparable to the teacher model.
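
As a rough illustration of the ideas summarized in the abstract, the sketch below combines a supervised loss on observed (positive) feedback with a distillation term on unobserved items that are sampled with probability decaying in the teacher's rank, so that items the teacher ranks highly are distilled more often. All names (`rank_aware_sample`, `cd_loss`), tensor shapes, and hyperparameters here are assumptions made for illustration; this is a minimal sketch of the general technique, not the authors' implementation.

```python
# Illustrative sketch: distillation loss for top-N recommendation with
# rank-aware sampling of unobserved items. Names and hyperparameters are
# assumed for this example, not taken from the paper.
import torch
import torch.nn.functional as F

def rank_aware_sample(teacher_scores, num_samples, temperature=10.0):
    """Sample unobserved items with probability decaying in the teacher's rank,
    so highly ranked (likely relevant) items are chosen more often."""
    # rank 0 = item the teacher scores highest
    ranks = teacher_scores.argsort(descending=True).argsort().float()
    probs = torch.softmax(-ranks / temperature, dim=-1)
    return torch.multinomial(probs, num_samples, replacement=False)

def cd_loss(student_scores, teacher_scores, observed, num_samples=10, lam=0.5):
    """Supervised BCE on observed feedback plus a distillation term that pushes
    the student toward the teacher's soft predictions on sampled unobserved items."""
    # Supervised term: observed interactions are treated as positives.
    pos_logits = student_scores[observed]
    sup = F.binary_cross_entropy_with_logits(pos_logits, torch.ones_like(pos_logits))
    # Distillation term on rank-aware sampled unobserved items.
    unobserved = ~observed
    idx = rank_aware_sample(teacher_scores[unobserved], num_samples)
    teacher_soft = torch.sigmoid(teacher_scores[unobserved][idx])
    student_logits = student_scores[unobserved][idx]
    dist = F.binary_cross_entropy_with_logits(student_logits, teacher_soft)
    return sup + lam * dist

# Toy usage: one user's scores over 100 items, 3 observed positives.
teacher = torch.randn(100)
student = torch.randn(100, requires_grad=True)
observed = torch.zeros(100, dtype=torch.bool)
observed[[3, 17, 42]] = True
loss = cd_loss(student, teacher, observed)
loss.backward()
```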

Original language: English
Title of host publication: Proceedings - 19th IEEE International Conference on Data Mining, ICDM 2019
Editors: Jianyong Wang, Kyuseok Shim, Xindong Wu
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 369-378
Number of pages: 10
ISBN (Electronic): 9781728146034
DOIs
Publication status: Published - 2019 Nov
Event: 19th IEEE International Conference on Data Mining, ICDM 2019 - Beijing, China
Duration: 2019 Nov 8 - 2019 Nov 11

Publication series

Name: Proceedings - IEEE International Conference on Data Mining, ICDM
Volume: 2019-November
ISSN (Print): 1550-4786

Conference

Conference: 19th IEEE International Conference on Data Mining, ICDM 2019
Country/Territory: China
City: Beijing
Period: 19/11/8 - 19/11/11

Bibliographical note

Funding Information:
This work was supported by the National Research Foundation of Korea (NRF) grant (No. NRF-2018R1A2B6009135 and NRF-2019R1A2C2006123) and the Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2019-0-00421, AI Graduate School Support Program).

Publisher Copyright:
© 2019 IEEE.

All Science Journal Classification (ASJC) codes

  • Engineering (all)
