kth.se Publications · KTH
Layer-wise Efficient Federated Learning with Distributed Clustering and D2D Communications
Sony (China) Research Laboratory, Beijing, China.
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering. ORCID iD: 0000-0002-0094-8948
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering. ORCID iD: 0000-0002-3211-4710
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering. ORCID iD: 0000-0001-9810-3478
2024 (English). In: 2024 IEEE 25th International Workshop on Signal Processing Advances in Wireless Communications, SPAWC 2024, Institute of Electrical and Electronics Engineers (IEEE), 2024, p. 831-835. Conference paper, Published paper (Refereed)
Abstract [en]

This paper explores the integration of device-to-device communications into clustered federated learning (FL), where clients are grouped into multiple clusters based on the similarity of their learning tasks. To mitigate communication costs, we propose an efficient FL algorithm. Specifically, we designate a primary client within each cluster, responsible for uploading the model to the server, while the other clients in the cluster serve as secondary clients. Each secondary client assesses its model's similarity to the primary client's model by computing a layer-wise model distance. If a secondary client's model distance exceeds a predefined threshold, indicating divergence from the primary client's model, it transmits its model distance to the edge server. The primary client then updates the cluster model parameters and broadcasts them to the secondary clients within the cluster. Closed-form expressions for the time spent by the proposed layer-wise efficient FL are derived. Numerical results validate the training accuracy of the layer-wise efficient FL and demonstrate a notable reduction in communication costs compared to naive FL.
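The layer-wise threshold check described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-layer L2 distance metric, the dictionary-based model representation, and all names (`layer_wise_distance`, `layers_to_upload`, `threshold`) are assumptions made for the example.

```python
import math

def layer_wise_distance(secondary_model, primary_model):
    """Per-layer L2 distance between two models, each a dict of
    layer name -> flat list of weights (an assumed representation)."""
    return {
        layer: math.sqrt(sum((a - b) ** 2 for a, b in zip(w, primary_model[layer])))
        for layer, w in secondary_model.items()
    }

def layers_to_upload(secondary_model, primary_model, threshold):
    """Keep only the layers whose distance to the primary client's model
    exceeds the threshold; only these would be reported to the edge server."""
    distances = layer_wise_distance(secondary_model, primary_model)
    return {layer: d for layer, d in distances.items() if d > threshold}

# Toy example: the "fc" layer has diverged from the primary model,
# while "conv1" is identical and would not be transmitted.
primary = {"conv1": [0.1, 0.2], "fc": [0.5, 0.5]}
secondary = {"conv1": [0.1, 0.2], "fc": [1.0, 0.0]}
diverged = layers_to_upload(secondary, primary, threshold=0.3)
```

Here only the diverged `fc` layer crosses the threshold, so a secondary client would skip uploading the unchanged `conv1` layer, which is the source of the communication savings the abstract reports.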

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024. p. 831-835
Keywords [en]
client clustering, communication cost, device-to-device communications, Federated learning, layer-wise
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-355485
DOI: 10.1109/SPAWC60668.2024.10694390
ISI: 001337964100167
Scopus ID: 2-s2.0-85207079025
OAI: oai:DiVA.org:kth-355485
DiVA id: diva2:1909473
Conference
25th IEEE International Workshop on Signal Processing Advances in Wireless Communications, SPAWC 2024, Lucca, Italy, Sep 10 2024 - Sep 13 2024
Note

Part of ISBN 9798350393187

QC 20241105

Available from: 2024-10-30. Created: 2024-10-30. Last updated: 2025-01-20. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text · Scopus

Authority records

Liu, Xiangnan; Huang, Xinyu; Fischione, Carlo

Search in DiVA

By author/editor
Liu, Xiangnan; Huang, Xinyu; Fischione, Carlo
By organisation
Network and Systems Engineering
Computer Sciences

Search outside of DiVA

Google · Google Scholar
