TY - JOUR
T1 - Communication-Efficient and Distributed Learning over Wireless Networks
T2 - Principles and Applications
AU - Park, Jihong
AU - Samarakoon, Sumudu
AU - Elgabli, Anis
AU - Kim, Joongheon
AU - Bennis, Mehdi
AU - Kim, Seong Lyun
AU - Debbah, Merouane
N1 - Publisher Copyright:
© 1963-2012 IEEE.
PY - 2021/5
Y1 - 2021/5
N2 - Machine learning (ML) is a promising enabler for fifth-generation (5G) communication systems and beyond. By imbuing intelligence into the network edge, edge nodes can proactively carry out decision-making and thereby react to local environmental changes and disturbances with zero communication latency. To achieve this goal, it is essential to ensure high ML inference accuracy at scale under time-varying channel and network dynamics by continuously exchanging fresh data and ML model updates in a distributed way. Taming this new kind of data traffic boils down to improving the communication efficiency of distributed learning by optimizing communication payload types, transmission techniques, and scheduling, as well as ML architectures, algorithms, and data processing methods. To this end, this article provides a holistic overview of the relevant communication and ML principles and presents communication-efficient and distributed learning frameworks with selected use cases.
AB - Machine learning (ML) is a promising enabler for fifth-generation (5G) communication systems and beyond. By imbuing intelligence into the network edge, edge nodes can proactively carry out decision-making and thereby react to local environmental changes and disturbances with zero communication latency. To achieve this goal, it is essential to ensure high ML inference accuracy at scale under time-varying channel and network dynamics by continuously exchanging fresh data and ML model updates in a distributed way. Taming this new kind of data traffic boils down to improving the communication efficiency of distributed learning by optimizing communication payload types, transmission techniques, and scheduling, as well as ML architectures, algorithms, and data processing methods. To this end, this article provides a holistic overview of the relevant communication and ML principles and presents communication-efficient and distributed learning frameworks with selected use cases.
KW - 6G
KW - beyond 5G
KW - beyond federated learning (FL)
KW - communication efficiency
KW - distributed machine learning
UR - http://www.scopus.com/inward/record.url?scp=85101758780&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85101758780&partnerID=8YFLogxK
U2 - 10.1109/JPROC.2021.3055679
DO - 10.1109/JPROC.2021.3055679
M3 - Article
AN - SCOPUS:85101758780
SN - 0018-9219
VL - 109
SP - 796
EP - 819
JO - Proceedings of the IEEE
JF - Proceedings of the IEEE
IS - 5
M1 - 9357490
ER -