Adaptive Expert Models for Federated Learning
2023 (English). In: Trustworthy Federated Learning: First International Workshop, FL 2022 / [ed] Goebel, R., Yu, H., Faltings, B., Fan, L., Xiong, Z., Springer Nature, 2023, Vol. 13448, pp. 1-16. Conference paper, published paper (peer-reviewed)
Abstract [en]
Federated Learning (FL) is a promising framework for distributed learning when data is private and sensitive. However, the state-of-the-art solutions in this framework are not optimal when data is heterogeneous and non-IID. We propose a practical and robust approach to personalization in FL that adjusts to heterogeneous and non-IID data by balancing exploration and exploitation of several global models. To achieve our aim of personalization, we use a Mixture of Experts (MoE) that learns to group clients that are similar to each other, while using the global models more efficiently. We show that our approach achieves an accuracy up to 29.78% better than the state-of-the-art and up to 4.38% better compared to a local model in a pathological non-IID setting, even though we tune our approach in the IID setting.
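The abstract describes weighting several global models per client with a Mixture of Experts (MoE). A minimal sketch of that idea, assuming each global model is a fixed linear expert and each client learns only a small softmax gate over them locally (all names, shapes, and the linear experts are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax; turns gate logits into mixing weights.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Three hypothetical "global models": linear maps from 4 features to 1 output.
experts = [rng.normal(size=4) for _ in range(3)]

def moe_predict(x, gate_logits):
    """Mix the global experts' outputs with a client-specific softmax gate."""
    weights = softmax(gate_logits)                     # per-client mixing weights
    outputs = np.array([w @ x for w in experts])       # each expert's prediction
    return weights @ outputs                           # gated combination

# A client whose local data resembles expert 0 would, after local training,
# end up with gate logits favoring that expert.
x = rng.normal(size=4)
gate_logits = np.array([2.0, 0.0, 0.0])  # learned on local data in practice
pred = moe_predict(x, gate_logits)
```

In a personalized-FL setting, clients with similar data would learn similar gates, which is how such a mixture can implicitly group similar clients while reusing a small set of shared global models.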
Place, publisher, year, edition, pages
Springer Nature, 2023. Vol. 13448, pp. 1-16
Series
Lecture Notes in Artificial Intelligence, ISSN 2945-9133
Keywords [en]
Federated learning, Personalization, Privacy preserving
HSV category
Identifiers
URN: urn:nbn:se:kth:diva-330493
DOI: 10.1007/978-3-031-28996-5_1
ISI: 000999818400001
Scopus ID: 2-s2.0-85152565856
OAI: oai:DiVA.org:kth-330493
DiVA, id: diva2:1777921
Conference
Trustworthy Federated Learning - First International Workshop, FL 2022, Held in Conjunction with IJCAI 2022, Vienna, Austria, July 23, 2022
Note
Part of proceedings, ISBN 978-3-031-28995-8, 978-3-031-28996-5
QC 20230630
Available from: 2023-06-30. Created: 2023-06-30. Last updated: 2024-05-27. Bibliographically approved.