Abstract
In recent years, quantization methods have successfully accelerated the training of large deep neural network (DNN) models by reducing the precision of computing operations (e.g., forward/backward passes) without sacrificing accuracy. In this work, we therefore attempt to apply this quantization idea to popular matrix factorization (MF) methods to cope with the growing scale of models and datasets in recommender systems. To our dismay, however, we observe that state-of-the-art quantization methods are not effective in training MF models, unlike their success in training DNN models. We posit that two distinctive features of MF training explain this difference: (i) the training of MF models is much more memory-intensive than that of DNN models, and (ii) the quantization errors across users and items in recommendation are not uniform. From these observations, we develop a quantization framework for MF models, named MASCOT, which employs novel strategies (i.e., m-quantization and g-switching) to address the aforementioned limitations of quantization in MF training. A comprehensive evaluation on four real-world datasets demonstrates that MASCOT improves the training performance of MF models by about 45% over training without quantization, while maintaining low model errors, and that the strategies and implementation optimizations of MASCOT are quite effective in the training of MF models. We release the code of MASCOT and the datasets at: https://github.com/Yujaeseo/lCDM-2021_MASCOT.
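To make the two named strategies concrete, below is a minimal sketch of how they could look in plain NumPy. All names and details here are assumptions for illustration, not the authors' implementation: "m-quantization" is sketched as storing the embedding matrices in half precision (targeting the memory-intensive nature of MF training), and "g-switching" as computing the gradient update in fp32 only for users/items flagged as sensitive to quantization error.

```python
import numpy as np

# Hypothetical sketch of the two ideas named in the abstract; this is
# NOT the MASCOT implementation (see the linked repository for that).

rng = np.random.default_rng(0)
n_users, n_items, k = 1000, 500, 32

# m-quantization (sketch): keep the factor matrices P and Q in fp16 to
# reduce the memory traffic that dominates MF training.
P = (0.1 * rng.standard_normal((n_users, k))).astype(np.float16)
Q = (0.1 * rng.standard_normal((n_items, k))).astype(np.float16)

def sgd_step(u, i, r_ui, lr=0.01, reg=0.02, high_precision=False):
    # g-switching (sketch): use fp32 for this update when the caller has
    # flagged user u / item i as error-sensitive, and fp16 otherwise.
    dtype = np.float32 if high_precision else np.float16
    p, q = P[u].astype(dtype), Q[i].astype(dtype)
    err = dtype(r_ui) - p @ q  # rating residual for this observation
    # Standard regularized MF update; results are stored back in fp16.
    P[u] = (p + lr * (err * q - reg * p)).astype(np.float16)
    Q[i] = (q + lr * (err * p - reg * q)).astype(np.float16)

sgd_step(u=3, i=7, r_ui=4.0)                        # low-precision update
sgd_step(u=3, i=7, r_ui=4.0, high_precision=True)   # switched update
```

How the error-sensitive users/items are identified, and when the switch is triggered, is exactly the non-uniform-error problem the paper addresses; the flag above is a stand-in for that decision.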
Original language | English |
---|---|
Title of host publication | Proceedings - 21st IEEE International Conference on Data Mining, ICDM 2021 |
Editors | James Bailey, Pauli Miettinen, Yun Sing Koh, Dacheng Tao, Xindong Wu |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 290-299 |
Number of pages | 10 |
ISBN (Electronic) | 9781665423984 |
DOIs | |
Publication status | Published - 2021 |
Event | 21st IEEE International Conference on Data Mining, ICDM 2021 - Virtual, Online, New Zealand |
Duration | 2021 Dec 7 → 2021 Dec 10 |
Publication series
Name | Proceedings - IEEE International Conference on Data Mining, ICDM |
---|---|
Volume | 2021-December |
ISSN (Print) | 1550-4786 |
Conference
Conference | 21st IEEE International Conference on Data Mining, ICDM 2021 |
---|---|
Country/Territory | New Zealand |
City | Virtual, Online |
Period | 2021 Dec 7 → 2021 Dec 10 |
Bibliographical note
Funding Information: The work of Sang-Wook Kim was supported by the Samsung Research Funding & Incubation Center of Samsung Electronics under Project Number SRFC-IT1901-03. The work of Dongwon Lee was supported by NSF award #212114824.
Publisher Copyright:
© 2021 IEEE.
All Science Journal Classification (ASJC) codes
- Engineering (all)