Communication-Efficient and Distributed Learning over Wireless Networks: Principles and Applications

Jihong Park, Sumudu Samarakoon, Anis Elgabli, Joongheon Kim, Mehdi Bennis, Seong Lyun Kim, Merouane Debbah

Research output: Contribution to journal › Article › peer-review

133 Citations (Scopus)

Abstract

Machine learning (ML) is a promising enabler for the fifth-generation (5G) communication systems and beyond. By imbuing intelligence into the network edge, edge nodes can proactively carry out decision-making and, thereby, react to local environmental changes and disturbances while experiencing zero communication latency. To achieve this goal, it is essential to cater for high ML inference accuracy at scale under the time-varying channel and network dynamics, by continuously exchanging fresh data and ML model updates in a distributed way. Taming this new kind of data traffic boils down to improving the communication efficiency of distributed learning by optimizing communication payload types, transmission techniques, and scheduling, as well as ML architectures, algorithms, and data processing methods. To this end, this article aims to provide a holistic overview of relevant communication and ML principles and, thereby, present communication-efficient and distributed learning frameworks with selected use cases.
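To make the abstract's central idea concrete, the sketch below shows clients exchanging compact model updates instead of raw data, with the server averaging them (a federated-averaging-style loop) and the uplink payload reduced by top-k sparsification. This is not code from the article: the linear-regression task, the sparsification rule, and all names and hyperparameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
DIM, CLIENTS, ROUNDS, LOCAL_STEPS, LR, TOPK = 20, 5, 30, 5, 0.1, 4

# Synthetic per-client data: y = X @ w_true + noise.
w_true = rng.normal(size=DIM)
data = []
for _ in range(CLIENTS):
    X = rng.normal(size=(100, DIM))
    y = X @ w_true + 0.1 * rng.normal(size=100)
    data.append((X, y))

def local_update(w, X, y):
    # A few local gradient steps on the client's data; only the model delta is returned.
    w_local = w.copy()
    for _ in range(LOCAL_STEPS):
        grad = 2 * X.T @ (X @ w_local - y) / len(y)
        w_local -= LR * grad
    return w_local - w

def sparsify(delta, k):
    # Keep only the k largest-magnitude entries, shrinking the uplink payload.
    out = np.zeros_like(delta)
    idx = np.argsort(np.abs(delta))[-k:]
    out[idx] = delta[idx]
    return out

w_global = np.zeros(DIM)
for _ in range(ROUNDS):
    # Each client uploads a sparsified delta; the server averages the deltas.
    deltas = [sparsify(local_update(w_global, X, y), TOPK) for X, y in data]
    w_global += np.mean(deltas, axis=0)

print("parameter error:", np.linalg.norm(w_global - w_true))

The same loop structure accommodates other payload choices discussed in the article's scope, such as quantized updates or model outputs, by swapping the sparsify step.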

Original language: English
Article number: 9357490
Pages (from-to): 796-819
Number of pages: 24
Journal: Proceedings of the IEEE
Volume: 109
Issue number: 5
DOIs
Publication status: Published - May 2021

Bibliographical note

Publisher Copyright:
© 1963-2012 IEEE.

All Science Journal Classification (ASJC) codes

  • General Computer Science
  • Electrical and Electronic Engineering
