Adversarial Training with Maximal Coding Rate Reduction
2024 (English). In: Conference Record of the 58th Asilomar Conference on Signals, Systems and Computers, ACSSC 2024, Institute of Electrical and Electronics Engineers (IEEE), 2024, pp. 1866-1870. Conference paper, published paper (refereed).
Abstract [en]
Deep convolutional networks can solve a variety of complex image-processing tasks. However, adversarial attacks have been shown to fool deep learning models. Adversarial training, which incorporates adversarial examples into the training process, is a commonly used strategy for improving the robustness of deep learning models against such examples. Traditionally, cross-entropy is used as the loss function during this process. To further improve robustness against adversarial examples, we propose in this paper two new adversarial training methods based on the principle of Maximal Coding Rate Reduction (MCR2). We evaluate the different adversarial training methods by comparing their clean accuracy and adversarial accuracy, and show that adversarial training with the MCR2 loss function yields a more robust network than the traditional adversarial training method. In our experiments, adversarial accuracies improve by up to 10%. We further discuss the two loss functions using a simple model.
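The MCR2 principle referenced in the abstract maximizes the difference between the coding rate of the whole feature set and the sum of per-class coding rates, so that features of different classes expand while features within a class compress. The paper's own training procedure is not reproduced here; the following is a minimal NumPy sketch of the standard MCR2 objective from the literature, where `Z` is a d x n matrix of (column) feature vectors, `eps` is the allowed distortion, and the function and variable names are illustrative assumptions rather than the authors' code:

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z) = 1/2 logdet(I + d/(n*eps^2) * Z Z^T) for a d x n feature matrix."""
    d, n = Z.shape
    _, logdet = np.linalg.slogdet(np.eye(d) + (d / (n * eps ** 2)) * Z @ Z.T)
    return 0.5 * logdet

def mcr2_objective(Z, labels, eps=0.5):
    """Delta R = R(Z) - sum_j (n_j / n) R(Z_j): expand all features,
    compress the features of each class; maximized during training."""
    _, n = Z.shape
    expand = coding_rate(Z, eps)
    compress = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]          # features belonging to class c
        compress += (Zc.shape[1] / n) * coding_rate(Zc, eps)
    return expand - compress
```

In an adversarial training loop, one would compute this objective on the network's (typically unit-normalized) features of adversarially perturbed inputs and ascend its gradient, in place of descending the cross-entropy loss.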
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024, pp. 1866-1870.
Keywords [en]
adversarial attack, adversarial example, adversarial training, deep neural networks, Machine learning, quadratic similarity queries on compressed data
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-362682
DOI: 10.1109/IEEECONF60004.2024.10942802
ISI: 001479671800342
Scopus ID: 2-s2.0-105002685564
OAI: oai:DiVA.org:kth-362682
DiVA id: diva2:1954124
Conference
58th Asilomar Conference on Signals, Systems and Computers, ACSSC 2024, Hybrid, Pacific Grove, United States of America, Oct 27 2024 - Oct 30 2024
Note
Part of ISBN 9798350354058
QC 20250425
Available from: 2025-04-23. Created: 2025-04-23. Last updated: 2025-12-05. Bibliographically approved.