Communication-Efficient Semi-Decentralized Federated Learning in the Presence of Stragglers
Li, Chengxi. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0003-1649-1943
Xiao, Ming. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0002-5407-0835
Skoglund, Mikael. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0002-7926-5081
2025 (English). In: IEEE Transactions on Communications, ISSN 0090-6778, E-ISSN 1558-0857, Vol. 73, no. 12, p. 13999-14013. Article in journal (Refereed). Published.
Abstract [en]

In this paper, we consider the problem of federated learning (FL) with devices that have intermittent connectivity to the central server. For this problem, the concept of semi-decentralized FL has been proposed in the literature. This paradigm allows non-straggler devices to relay the gradients computed by the stragglers to the server, and enables gradient coding (GC) to mitigate the negative impact of stragglers that fail to communicate directly with the central server. However, for GC in semi-decentralized FL, the communication overhead caused by information exchange among the devices is significant. To overcome this shortcoming, inspired by the existing communication-optimal exact consensus algorithm (CECA), we propose a new communication-efficient semi-decentralized FL method (COFFEE). In each round, the devices exchange information by taking a certain number of steps towards communication-optimal exact consensus, ensuring that each device obtains the average of the gradients computed by itself and its previous neighbors. The non-stragglers then transmit their local average results to the server, which aggregates them to update the global model. We analytically characterize the convergence performance and the communication overhead of COFFEE. Building on this analysis, to further enhance learning performance under a given communication budget, we propose an enhanced version of COFFEE with an adaptive aggregation rule at the central server, referred to as A-COFFEE, which adapts to the straggler pattern of the devices over training rounds. Experiments verify that the proposed methods outperform the baseline methods.
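
As a rough illustration only (not the authors' algorithm): the sketch below mimics the round structure the abstract describes, with plain uniform gossip averaging over a ring standing in for CECA's communication-optimal consensus steps. The topology, mixing weights, step count, and straggler pattern are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 6, 4          # devices, model dimension (toy sizes)
k_steps = 3          # device-to-device mixing steps per round

# Local gradients, one row per device (stand-ins for real gradients).
grads = rng.normal(size=(n, d))

# Doubly stochastic mixing matrix over a ring topology: each device
# averages with itself and its two neighbors. CECA would instead use
# communication-optimal weights; uniform weights are an assumption here.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 1 / 3
    W[i, (i - 1) % n] = 1 / 3
    W[i, (i + 1) % n] = 1 / 3

# Device-to-device phase: each mixing step drives every row toward the
# global average of all gradients (exact consensus as k_steps grows).
x = grads.copy()
for _ in range(k_steps):
    x = W @ x

# Intermittent connectivity: only non-stragglers reach the server.
is_straggler = np.array([False, True, False, True, False, False])
received = x[~is_straggler]

# Server aggregation: average what was received; after enough mixing
# steps this approximates the full-gradient average despite stragglers.
agg = received.mean(axis=0)

print("true average:     ", grads.mean(axis=0))
print("server aggregate: ", agg)
```

In this toy setting, increasing k_steps shrinks the gap between the server aggregate and the true gradient average at the cost of more device-to-device communication, which is the overhead/accuracy trade-off the abstract says COFFEE is designed around.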

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025. Vol. 73, no. 12, p. 13999-14013.
Keywords [en]
communication efficiency, federated learning, intermittent connectivity, stragglers
National Category
Control Engineering; Communication Systems
Identifiers
URN: urn:nbn:se:kth:diva-370075
DOI: 10.1109/TCOMM.2025.3605479
Scopus ID: 2-s2.0-105015207710
OAI: oai:DiVA.org:kth-370075
DiVA, id: diva2:1999711
Note

QC 20250922

Available from: 2025-09-22. Created: 2025-09-22. Last updated: 2025-12-30. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Li, Chengxi; Xiao, Ming; Skoglund, Mikael
