Federated Learning (FL) is a promising framework for distributed learning when data is private and sensitive. However, the state-of-the-art solutions in this framework are not optimal when data is heterogeneous and non-Independent and Identically Distributed (non-IID). We propose a practical and robust approach to personalization in FL that adapts to heterogeneous and non-IID data by balancing exploration and exploitation of several global models. To achieve our aim of personalization, we use a Mixture of Experts (MoE) that learns to group clients that are similar to each other, while using the global models more efficiently. We show that our approach achieves an accuracy up to 29.78 % better than the state-of-the-art and up to 4.38 % better than a local model in a pathological non-IID setting, even though we tune our approach in the IID setting.
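The core idea, blending a client's local model with several global models through a learned gate, can be sketched roughly as follows. This is a minimal illustrative assumption of how an MoE gate combines experts, not the paper's actual implementation; all function names and the toy linear "models" are hypothetical.

```python
# Hypothetical sketch: a mixture-of-experts (MoE) gate blends a client's
# local model with several global models. Gating scores would in practice
# come from a learned per-client gating network; here they are fixed inputs.
import math

def softmax(scores):
    # Numerically stable softmax over the raw gating scores.
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def moe_predict(x, local_model, global_models, gate_scores):
    # One gating score per expert: the local model first, then each global.
    experts = [local_model] + list(global_models)
    weights = softmax(gate_scores)
    # Output is the gate-weighted combination of all expert predictions.
    return sum(w * f(x) for w, f in zip(weights, experts))

# Toy experts: simple linear functions standing in for trained models.
local = lambda x: 2.0 * x
global_experts = [lambda x: 0.5 * x, lambda x: -1.0 * x]

y = moe_predict(3.0, local, global_experts, gate_scores=[2.0, 0.5, 0.1])
```

A high gating score on the local expert corresponds to exploiting client-specific knowledge, while mass on the global experts corresponds to exploring shared models, which is how such a gate can group similar clients.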