Accelerating Energy-Efficient Federated Learning in Cell-Free Networks With Adaptive Quantization
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Communication Systems, CoS. ORCID iD: 0000-0001-8826-2088
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. ORCID iD: 0000-0002-5407-0835
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Communication Systems, CoS. ORCID iD: 0000-0002-5954-434X
2025 (English). In: IEEE Transactions on Machine Learning in Communications and Networking, ISSN 2831-316X, Vol. 3, pp. 761-778. Article in journal (Refereed) Published
Abstract [en]

Federated Learning (FL) enables clients to share model parameters instead of raw data, reducing communication overhead. Traditional wireless networks, however, suffer from latency issues when supporting FL. Cell-Free Massive MIMO (CFmMIMO) offers a promising alternative, as it can serve multiple clients simultaneously on shared resources, enhancing spectral efficiency and reducing latency in large-scale FL. Still, communication resource constraints at the client side can impede the completion of FL training. To tackle this issue, we propose a low-latency, energy-efficient FL framework with optimized uplink power allocation for efficient uplink communication. Our approach integrates an adaptive quantization strategy that dynamically adjusts bit allocation for local gradient updates, significantly lowering communication cost. We formulate a joint optimization problem involving FL model updates, local iterations, and power allocation. This problem is solved using sequential quadratic programming (SQP) to balance energy consumption and latency. Moreover, for local model training, clients employ the AdaDelta optimizer, which improves convergence compared to standard SGD, Adam, and RMSProp. We also provide a theoretical analysis of FL convergence under AdaDelta. Numerical results demonstrate that, under equal energy and latency budgets, our power allocation strategy improves test accuracy by up to 7% and 19% compared to Dinkelbach and max-sum rate approaches. Furthermore, across all power allocation methods, our quantization scheme outperforms AQUILA and LAQ, increasing test accuracy by up to 36% and 35%, respectively.
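The adaptive quantization described in the abstract — each client compresses its local gradient update to a per-round bit budget before uplink transmission — can be illustrated with a minimal sketch. This is not the paper's actual algorithm (nor AQUILA or LAQ); `quantize` is standard uniform stochastic quantization, and `choose_bits` is a purely hypothetical bit-allocation rule invented here to show the "dynamic bit allocation" idea.

```python
import numpy as np

def quantize(grad, num_bits):
    """Uniform stochastic quantization of a gradient vector to num_bits bits.

    Returns the lossy gradient the server would recover after dequantization.
    Stochastic rounding keeps the quantized gradient an unbiased estimate."""
    levels = 2 ** num_bits - 1
    g_max = np.max(np.abs(grad))
    if g_max == 0:
        return grad.copy()
    scaled = np.abs(grad) / g_max * levels      # map magnitudes to [0, levels]
    floor = np.floor(scaled)
    prob = scaled - floor                       # round up with this probability
    q = floor + (np.random.rand(*grad.shape) < prob)
    return np.sign(grad) * q / levels * g_max

def choose_bits(grad_norm, prev_norm, b_min=2, b_max=8):
    """Toy bit-allocation rule (an assumption, not the paper's policy):
    spend more bits while the gradient norm is still large relative to the
    previous round, fewer bits as training converges."""
    ratio = grad_norm / max(prev_norm, 1e-12)
    b = int(round(b_min + (b_max - b_min) * min(ratio, 1.0)))
    return max(b_min, min(b, b_max))
```

With `num_bits` bits the per-element quantization error is bounded by `max|g| / (2**num_bits - 1)`, so lowering the bit budget late in training trades a small accuracy loss for a proportional reduction in uplink payload.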

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025. Vol. 3, pp. 761-778
Keywords [en]
Quantization (signal), Training, Convergence, Optimization, Adaptation models, Uplink, Resource management, Data models, Costs, Accuracy, Federated learning, cell-free massive MIMO networks, adaptive quantization, power allocation, energy efficiency, straggler effect
National subject category
Telecommunications
Identifiers
URN: urn:nbn:se:kth:diva-371455
DOI: 10.1109/TMLCN.2025.3583659
ISI: 001522923100001
Scopus ID: 2-s2.0-105027859876
OAI: oai:DiVA.org:kth-371455
DiVA id: diva2:2020738
Note

Not duplicate with DiVA 1927527

QC 20260129

Available from: 2025-12-11. Created: 2025-12-11. Last updated: 2026-01-29. Bibliographically approved.

Open Access in DiVA

Full text not available in DiVA

Other links

Publisher's full text
Scopus

Person

Mahmoudi, Afsaneh; Xiao, Ming; Björnson, Emil

Search in DiVA

By author/editor
Mahmoudi, Afsaneh; Xiao, Ming; Björnson, Emil
By organisation
Communication Systems, CoS; Information Science and Engineering
Telecommunications

Search outside DiVA

Google
Google Scholar
