MASCOT: A Quantization Framework for Efficient Matrix Factorization in Recommender Systems

Yunyong Ko, Jae Seo Yu, Hong Kyun Bae, Yongjun Park, Dongwon Lee, Sang Wook Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

In recent years, quantization methods have successfully accelerated the training of large deep neural network (DNN) models by reducing the precision of computing operations (e.g., forward/backward passes) without sacrificing model accuracy. In this work, therefore, we attempt to apply such a quantization idea to the popular matrix factorization (MF) methods to deal with the growing scale of models and datasets in recommender systems. To our dismay, however, we observe that state-of-the-art quantization methods are not effective in the training of MF models, unlike their successes in the training of DNN models. To explain this phenomenon, we posit that two distinctive features of MF training account for the difference: (i) the training of MF models is much more memory-intensive than that of DNN models, and (ii) the quantization errors across users and items in recommendation are not uniform. Based on these observations, we develop a quantization framework for MF models, named MASCOT, that employs novel strategies (i.e., m-quantization and g-switching) to address the aforementioned limitations of quantization in the training of MF models. A comprehensive evaluation on four real-world datasets demonstrates that MASCOT improves the training performance of MF models by about 45% over training without quantization while maintaining low model errors, and that the strategies and implementation optimizations of MASCOT are quite effective in the training of MF models. For further details, we release the code of MASCOT and the datasets at: https://github.com/Yujaeseo/lCDM-2021_MASCOT.
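The abstract names MASCOT's two strategies (m-quantization and g-switching) but does not spell out their algorithms, so the following is only a minimal NumPy sketch of the general idea it builds on: training MF by SGD while storing the latent factors in reduced precision, with a hypothetical per-user/per-item error tracker (user_err, item_err, ERR_THRESHOLD) standing in for error-aware precision switching. None of the names, thresholds, or update rules below come from the paper.

```python
# Illustrative sketch only: generic low-precision MF training with SGD.
# The precision-switching heuristic here is an assumption for illustration,
# not MASCOT's m-quantization or g-switching.
import numpy as np

rng = np.random.default_rng(0)

# Toy rating data: (user, item, rating) triples.
num_users, num_items, rank = 100, 80, 8
ratings = [(rng.integers(num_users), rng.integers(num_items), rng.uniform(1, 5))
           for _ in range(2000)]

# Latent factors kept in half precision (memory-quantized parameters),
# alongside full-precision "master" copies used for accumulation.
P_master = (0.1 * rng.standard_normal((num_users, rank))).astype(np.float32)
Q_master = (0.1 * rng.standard_normal((num_items, rank))).astype(np.float32)
P_low = P_master.astype(np.float16)
Q_low = Q_master.astype(np.float16)

lr, reg = 0.01, 0.02
# Hypothetical per-entity quantization-error trackers: entities whose rounding
# error grows too large fall back to full-precision reads.
user_err = np.zeros(num_users, dtype=np.float32)
item_err = np.zeros(num_items, dtype=np.float32)
ERR_THRESHOLD = 1e-2

for epoch in range(5):
    sse = 0.0
    for u, i, r in ratings:
        # Read factors in low or full precision, per entity.
        p = P_master[u] if user_err[u] > ERR_THRESHOLD else P_low[u].astype(np.float32)
        q = Q_master[i] if item_err[i] > ERR_THRESHOLD else Q_low[i].astype(np.float32)

        err = r - float(p @ q)
        sse += err * err

        # Standard SGD updates applied to the full-precision master copies.
        P_master[u] += lr * (err * q - reg * p)
        Q_master[i] += lr * (err * p - reg * q)

        # Re-quantize the updated rows and record the rounding error introduced.
        P_low[u] = P_master[u].astype(np.float16)
        Q_low[i] = Q_master[i].astype(np.float16)
        user_err[u] = float(np.abs(P_master[u] - P_low[u].astype(np.float32)).mean())
        item_err[i] = float(np.abs(Q_master[i] - Q_low[i].astype(np.float32)).mean())

    print(f"epoch {epoch}: RMSE = {np.sqrt(sse / len(ratings)):.4f}")
```

In this toy setting the half-precision copies of P and Q roughly halve their memory footprint, which is the kind of saving that matters when, as the abstract argues, MF training is memory-intensive rather than compute-bound.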

Original language: English
Title of host publication: Proceedings - 21st IEEE International Conference on Data Mining, ICDM 2021
Editors: James Bailey, Pauli Miettinen, Yun Sing Koh, Dacheng Tao, Xindong Wu
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 290-299
Number of pages: 10
ISBN (Electronic): 9781665423984
DOIs
Publication status: Published - 2021
Event: 21st IEEE International Conference on Data Mining, ICDM 2021 - Virtual, Online, New Zealand
Duration: 2021 Dec 7 - 2021 Dec 10

Publication series

Name: Proceedings - IEEE International Conference on Data Mining, ICDM
Volume: 2021-December
ISSN (Print): 1550-4786

Conference

Conference: 21st IEEE International Conference on Data Mining, ICDM 2021
Country/Territory: New Zealand
City: Virtual, Online
Period: 21/12/7 - 21/12/10

Bibliographical note

Funding Information:
The work of Sang-Wook Kim was supported by Samsung Research Funding & Incubation Center of Samsung Electronics under Project Number SRFC-IT1901-03. The work of Dongwon Lee was supported by the NSF award #212114824.

Publisher Copyright:
© 2021 IEEE.

All Science Journal Classification (ASJC) codes

  • Engineering (all)
