2025 (English) In: IEEE Transactions on Signal Processing, ISSN 1053-587X, E-ISSN 1941-0476, Vol. 73, p. 26-39. Article in journal (Refereed) Published
Abstract [en]
This paper explores constrained non-convex personalized federated learning (PFL), in which a group of workers train local models and a global model under the coordination of a server. To address the challenges of efficient information exchange and robustness against so-called Byzantine workers, we propose a projected stochastic gradient descent algorithm for PFL that simultaneously ensures Byzantine-robustness and communication efficiency. We implement personalized learning at the workers, aided by the global model, and employ a Huber function-based robust aggregation with an adaptive threshold-selecting strategy at the server to mitigate the effects of Byzantine attacks. To improve communication efficiency, we incorporate random communication, allowing multiple local updates per communication round. We establish the convergence of our algorithm, characterizing the effects of Byzantine attacks, random communication, and stochastic gradients on the learning error. Numerical experiments demonstrate the superiority of our algorithm in neural network training compared to existing algorithms.
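To make the server-side aggregation idea concrete, below is a minimal numpy sketch of one standard way to compute a Huber-based robust aggregate of worker updates: the Huber M-estimator of the mean, solved by iteratively reweighted averaging. The fixed threshold delta, the function names, and the Weiszfeld-style iteration are illustrative assumptions; the paper's adaptive threshold-selecting strategy is not reproduced here.

```python
import numpy as np

def huber_aggregate(updates, delta, max_iter=50, tol=1e-8):
    """Huber M-estimator of the mean of worker updates (illustrative sketch).

    Solves  min_z  sum_i huber(||z - x_i||_2; delta)  by iteratively
    reweighted averaging: updates close to z keep full weight 1, while
    distant (potentially Byzantine) updates are down-weighted by delta/dist.
    """
    updates = np.asarray(updates, dtype=float)   # shape: (n_workers, dim)
    z = np.median(updates, axis=0)               # robust starting point
    for _ in range(max_iter):
        dists = np.linalg.norm(updates - z, axis=1)
        w = np.minimum(1.0, delta / np.maximum(dists, 1e-12))
        z_new = (w[:, None] * updates).sum(axis=0) / w.sum()
        if np.linalg.norm(z_new - z) < tol:
            break
        z = z_new
    return z

# Toy usage: 8 honest updates near the origin, 2 large Byzantine outliers.
rng = np.random.default_rng(0)
honest = rng.normal(0.0, 0.1, size=(8, 5))
byzantine = np.full((2, 5), 10.0)
agg = huber_aggregate(np.vstack([honest, byzantine]), delta=0.5)
print(agg)  # stays close to the honest mean despite the outliers
```

The fixed-point iteration follows from setting the gradient of the Huber objective to zero: each weight is min(1, delta/dist), so honest updates are averaged essentially as in FedAvg while outliers contribute at most a bounded pull, which is the mechanism behind the Byzantine-robustness discussed in the abstract.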
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025
Keywords
Servers, Stochastic processes, Signal processing algorithms, Data models, Computational modeling, Vectors, Federated learning, Convergence, Adaptation models, Robustness, Personalized federated learning, communication efficiency, Byzantine-robustness, constrained non-convex optimization
National Category
Signal Processing
Identifiers
urn:nbn:se:kth:diva-358531 (URN), 10.1109/TSP.2024.3514802 (DOI), 001386428800008 (ISI), 2-s2.0-85211976780 (Scopus ID)
Note
QC 20250120
2025-01-20 Bibliographically approved