FedMes: Speeding up Federated Learning with Multiple Edge Servers

Dong Jun Han, Minseok Choi, Jungwuk Park, Jaekyun Moon

Research output: Contribution to journal › Article › peer-review

35 Citations (Scopus)

Abstract

We consider federated learning (FL) with multiple wireless edge servers, each having its own local coverage. We focus on speeding up training in this increasingly practical setup. Our key idea is to utilize the clients located in the overlapping coverage areas of adjacent edge servers (ESs): in the model-downloading stage, clients in the overlapping areas receive multiple models from different ESs, average the received models, and then update the averaged model with their local data. These clients then broadcast their updated models to the multiple ESs in range, acting as bridges that share the trained models between servers. Even when some ESs are given biased datasets within their coverage regions, their training processes can be assisted by adjacent servers through the clients in the overlapping regions. As a result, the proposed scheme does not require costly communication with the central cloud server (located at the tier above the edge servers) for model synchronization, significantly reducing the overall training time compared to conventional cloud-based FL systems. Extensive experimental results show remarkable performance gains over existing methods. Our design targets latency-sensitive applications where edge-based FL is essential, e.g., when a number of connected cars/drones must cooperate (via FL) to quickly adapt to dynamically changing environments.
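As a rough illustration of the client-side step described above, the sketch below (PyTorch-style Python) shows how a client in an overlapping coverage area might average the models received from multiple ESs, run a local update, and return the result to be broadcast back to every ES in range. The function names and hyperparameters are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of the client-side round described in the abstract.
# A client covered by several edge servers (ESs) averages the received models,
# updates the averaged model on its local data, and returns the result to be
# broadcast back to all covering ESs. Names here are illustrative only.

import copy
import torch


def average_state_dicts(state_dicts):
    """Element-wise average of model parameters received from several ESs."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        for sd in state_dicts[1:]:
            avg[key] = avg[key] + sd[key]
        avg[key] = avg[key] / len(state_dicts)
    return avg


def local_update(model, data_loader, epochs=1, lr=0.01):
    """A few epochs of local SGD on the client's own data."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in data_loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
    return model.state_dict()


def client_round(model, received_state_dicts, data_loader):
    """Average the models from all covering ESs, update locally, and return
    the updated parameters for broadcast back to those ESs."""
    model.load_state_dict(average_state_dicts(received_state_dicts))
    return local_update(model, data_loader)
```

A client covered by a single ES reduces to the usual FL update, since averaging over one received model is a no-op.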

Original language: English
Pages (from-to): 3870-3885
Number of pages: 16
Journal: IEEE Journal on Selected Areas in Communications
Volume: 39
Issue number: 12
DOIs
Publication status: Published - 2021 Dec 1

Bibliographical note

Publisher Copyright:
© 1983-2012 IEEE.

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Electrical and Electronic Engineering

