An Adaptive ML Framework for Power Converter Monitoring via Federated Transfer Learning
2025 (English) In: IEEE Transactions on Power Electronics, ISSN 0885-8993, E-ISSN 1941-0107, Vol. 40, no. 11, p. 16048-16054
Article in journal (Refereed) Published
Abstract [en]
This study explores alternative framework configurations for adapting thermal machine learning models for power converters by combining transfer learning (TL) and federated learning (FL) in a piecewise manner. This approach inherently addresses challenges such as varying operating conditions, data-sharing limitations, and security implications. The framework starts with a base model that is incrementally adapted by multiple clients using three state-of-the-art domain adaptation techniques: fine-tuning, transfer component analysis, and deep domain adaptation. The Flower framework is employed for FL, using federated averaging for aggregation. Validation with field data demonstrates that fine-tuning offers a straightforward TL approach with high accuracy, making it suitable for practical applications. Benchmarking results provide a comprehensive comparison of these methods, showing their respective strengths and weaknesses across different scenarios. Locally hosted FL improves performance when data aggregation is not feasible, whereas cloud-based FL becomes more practical as the number of clients grows substantially, addressing scalability and connectivity challenges.
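The aggregation step mentioned in the abstract, federated averaging (FedAvg), can be sketched in a few lines: each client trains locally and the server combines the returned model weights, weighted by each client's local dataset size. This is a minimal illustrative sketch, assuming a flat list-of-floats weight representation; the function name and example values are hypothetical and do not reproduce the authors' Flower-based implementation.

```python
def fedavg(client_weights, client_sizes):
    """Aggregate client model weights, weighted by local dataset size.

    client_weights: list of weight vectors (one flat list of floats per client)
    client_sizes:   number of local training samples at each client
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    aggregated = [0.0] * n_params
    for weights, size in zip(client_weights, client_sizes):
        # Each client's contribution is proportional to its share of the data.
        for i, w in enumerate(weights):
            aggregated[i] += w * (size / total)
    return aggregated

# Example: two clients with different amounts of local field data.
global_weights = fedavg(
    client_weights=[[1.0, 2.0], [3.0, 4.0]],
    client_sizes=[100, 300],
)
print(global_weights)  # [2.5, 3.5]
```

In a Flower deployment this weighted average is what the built-in FedAvg strategy computes on the server after each communication round; the sketch above only shows the arithmetic, not the client/server orchestration.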
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025. Vol. 40, no. 11, p. 16048-16054
Keywords [en]
Data models, Adaptation models, Training, Transfer learning, Accuracy, Monitoring, Federated learning, Servers, Scalability, Data aggregation, Domain adaptation, electric drives, federated learning (FL), power converter, transfer learning (TL)
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-373984
DOI: 10.1109/TPEL.2025.3559132
ISI: 001560475500030
Scopus ID: 2-s2.0-105002687101
OAI: oai:DiVA.org:kth-373984
DiVA, id: diva2:2023044
Note
QC 20251218
Available from: 2025-12-18 Created: 2025-12-18 Last updated: 2025-12-18 Bibliographically approved