KTH Publications (DiVA)
An Adaptive ML Framework for Power Converter Monitoring via Federated Transfer Learning
ABB AB, Corp Res Ctr, S-72358 Västerås, Sweden.
ABB Oy, Mot Syst Dr, Helsinki 00380, Finland.
KTH, School of Electrical Engineering and Computer Science (EECS), Electrical Engineering, Electric Power and Energy Systems. ORCID iD: 0000-0001-6831-3474
2025 (English). In: IEEE Transactions on Power Electronics, ISSN 0885-8993, E-ISSN 1941-0107, Vol. 40, no. 11, p. 16048-16054. Article in journal (Refereed). Published.
Abstract [en]

This study explores alternative framework configurations for adapting thermal machine learning models for power converters by combining transfer learning (TL) and federated learning (FL) in a piecewise manner. This approach inherently addresses challenges such as varying operating conditions, data-sharing limitations, and security implications. The framework starts with a base model that multiple clients incrementally adapt using three state-of-the-art domain adaptation techniques: fine-tuning, transfer component analysis, and deep domain adaptation. The Flower framework is employed for FL, with federated averaging used for aggregation. Validation with field data demonstrates that fine-tuning offers a straightforward TL approach with high accuracy, making it suitable for practical applications. Benchmarking results provide a comprehensive comparison of these methods, showing their respective strengths and weaknesses across different scenarios. Locally hosted FL enhances performance when data aggregation is not feasible, whereas cloud-based FL becomes more practical as the number of clients grows significantly, addressing scalability and connectivity challenges.
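The abstract states that aggregation is performed with federated averaging (FedAvg) via the Flower framework. As a minimal illustration of that aggregation rule only — the paper's actual model, client setup, and Flower configuration are not given in this record, so the function and variable names below are hypothetical — FedAvg computes a data-size-weighted mean of the clients' model weights:

```python
# Illustrative sketch of federated averaging (FedAvg): each client's
# model weights contribute to the global model in proportion to the
# number of local training samples the client holds.
import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate per-client weights by a sample-count-weighted mean.

    client_weights: list with one entry per client; each entry is a
                    list of np.ndarray, one array per model layer.
    client_sizes:   number of local training samples per client.
    """
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    aggregated = []
    for layer in range(n_layers):
        # Weighted sum over clients for this layer.
        layer_mean = sum(
            w[layer] * (n / total)
            for w, n in zip(client_weights, client_sizes)
        )
        aggregated.append(layer_mean)
    return aggregated

# Two hypothetical clients with a single-layer "model"; the second
# client holds three times more data, so its weights dominate.
clients = [[np.array([1.0, 2.0])], [np.array([3.0, 4.0])]]
sizes = [1, 3]
print(fedavg(clients, sizes)[0])  # prints [2.5 3.5]
```

In Flower this weighted mean is what the built-in FedAvg strategy applies to the weight updates returned by each client after local training; the sketch above reproduces only the arithmetic, not the communication rounds.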

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025. Vol. 40, no. 11, p. 16048-16054
Keywords [en]
Data models, Adaptation models, Training, Transfer learning, Accuracy, Monitoring, Federated learning, Servers, Scalability, Data aggregation, Domain adaptation, electric drives, federated learning (FL), power converter, transfer learning (TL)
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-373984
DOI: 10.1109/TPEL.2025.3559132
ISI: 001560475500030
Scopus ID: 2-s2.0-105002687101
OAI: oai:DiVA.org:kth-373984
DiVA, id: diva2:2023044
Note

QC 20251218

Available from: 2025-12-18. Created: 2025-12-18. Last updated: 2025-12-18. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | Scopus

Authority records

Peretti, Luca

