Data Augmentation Method for Transformer Fault Based on Improved Auto-Encoder Under the Condition of Insufficient Data
2021 (English). In: Diangong Jishu Xuebao/Transactions of China Electrotechnical Society, ISSN 1000-6753, Vol. 36, p. 84-94. Article in journal (Refereed), Published
Abstract [en]
Transformer faults are rare, so machine-learning-based methods for transformer fault diagnosis suffer from insufficient data. To address this, a method based on an improved auto-encoder (IAE) is proposed to augment transformer fault data. Firstly, to overcome the limited volume and lack of diversity of samples generated by the traditional auto-encoder, an improved sample-generation strategy for transformer faults is proposed. Secondly, because the traditional convolutional neural network loses considerable feature information in the pooling operation, an improved convolutional neural network (ICNN) is constructed as the fault-diagnosis classifier. Finally, the effectiveness and adaptability of the proposed method are verified on real data. The simulation results show that IAE accounts for both the distribution and the diversity of the data, and that the generated transformer fault data improve classifier performance more than traditional augmentation methods such as random over-sampling, the synthetic minority over-sampling technique (SMOTE), and the plain auto-encoder. Compared with traditional classifiers, ICNN achieves higher fault-diagnosis accuracy both before and after data augmentation.
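The core idea of auto-encoder-based augmentation described in the abstract can be sketched as follows: train an auto-encoder on the minority (fault) samples, then generate new samples by perturbing latent codes and decoding them. This is only an illustrative sketch with a toy numpy auto-encoder on synthetic stand-in features; the paper's specific IAE improvements, network sizes, and real dissolved-gas data are not given in this record and are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for transformer fault features (e.g. dissolved-gas ratios);
# real fault data would replace this synthetic matrix (40 samples, 6 features).
X = rng.normal(0.5, 0.1, size=(40, 6))

n_in, n_hid = X.shape[1], 3
W1 = rng.normal(0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.1, (n_hid, n_in)); b2 = np.zeros(n_in)

def encode(x):
    return np.tanh(x @ W1 + b1)

def decode(h):
    return h @ W2 + b2

# Train by plain gradient descent on the reconstruction MSE.
lr = 0.1
for _ in range(500):
    H = np.tanh(X @ W1 + b1)          # latent codes
    Y = H @ W2 + b2                   # reconstructions
    err = Y - X                       # gradient of MSE w.r.t. Y (up to a constant)
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1.0 - H**2)  # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def augment(X, n_new, noise=0.05):
    """Generate synthetic samples: encode random real samples,
    perturb the latent codes with Gaussian noise, and decode."""
    idx = rng.integers(0, len(X), n_new)
    h = encode(X[idx]) + rng.normal(0.0, noise, (n_new, n_hid))
    return decode(h)

X_aug = augment(X, 100)  # 100 synthetic fault samples with the original feature dimension
```

Because the perturbation happens in latent space rather than on the raw features, the generated samples stay near the learned data manifold while still varying, which is the diversity-versus-distribution trade-off the abstract attributes to IAE.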
Place, publisher, year, edition, pages
China Machine Press , 2021. Vol. 36, p. 84-94
Keywords [en]
Fault diagnosis, Improved auto-encoder, Insufficient data, Transformer, Classification (of information), Convolution, Convolutional neural networks, Fault detection, Signal encoding, Augmentation methods, Auto encoders, Convolutional neural network, Data augmentation, Fault data, Faults diagnosis, Transformer faults, Failure analysis
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:kth:diva-312317
DOI: 10.19595/j.cnki.1000-6753.tces.L90083
Scopus ID: 2-s2.0-85117044545
OAI: oai:DiVA.org:kth-312317
DiVA, id: diva2:1660190
Note
QC 20220523
Available from: 2022-05-23. Created: 2022-05-23. Last updated: 2023-11-06. Bibliographically approved