A systematic study of the class imbalance problem in convolutional neural networks
Buda, Mateusz (Duke Univ, Dept Radiol, Sch Med, Durham, NC 27710 USA; KTH Royal Inst Technol, Sch Elect Engn & Comp Sci, Stockholm, Sweden)
Maki, Atsuto (KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL). ORCID iD: 0000-0002-4266-6746
Mazurowski, Maciej A. (Duke Univ, Dept Radiol, Sch Med, Durham, NC 27710 USA; Duke Univ, Dept Elect & Comp Engn, Durham, NC USA)
2018 (English). In: Neural Networks, ISSN 0893-6080, E-ISSN 1879-2782, Vol. 106, p. 249-259. Article in journal (Refereed). Published.
Abstract [en]

In this study, we systematically investigate the impact of class imbalance on the classification performance of convolutional neural networks (CNNs) and compare frequently used methods to address the issue. Class imbalance is a common problem that has been comprehensively studied in classical machine learning, yet very limited systematic research is available in the context of deep learning. In our study, we use three benchmark datasets of increasing complexity, MNIST, CIFAR-10 and ImageNet, to investigate the effects of imbalance on classification and perform an extensive comparison of several methods to address the issue: oversampling, undersampling, two-phase training, and thresholding that compensates for prior class probabilities. Our main evaluation metric is area under the receiver operating characteristic curve (ROC AUC) adjusted to multi-class tasks, since the overall accuracy metric is associated with notable difficulties in the context of imbalanced data. Based on results from our experiments, we conclude that (i) the effect of class imbalance on classification performance is detrimental; (ii) the method of addressing class imbalance that emerged as dominant in almost all analyzed scenarios was oversampling; (iii) oversampling should be applied to the level that completely eliminates the imbalance, whereas the optimal undersampling ratio depends on the extent of imbalance; (iv) as opposed to some classical machine learning models, oversampling does not cause overfitting of CNNs; (v) thresholding should be applied to compensate for prior class probabilities when the overall number of properly classified cases is of interest.
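The two remedies the abstract singles out, oversampling until the imbalance is completely eliminated (conclusion iii) and thresholding that compensates for prior class probabilities (conclusion v), can be sketched as follows. This is an illustrative NumPy sketch under stated assumptions, not the authors' code; both function names and the array-based interface are invented for this example.

```python
import numpy as np

def oversample_to_balance(X, y, rng=None):
    """Randomly duplicate minority-class samples until every class
    reaches the size of the largest class, i.e. the imbalance is
    fully eliminated (the level the paper found optimal)."""
    rng = np.random.default_rng(rng)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    idx = []
    for c in classes:
        members = np.flatnonzero(y == c)
        # Keep all original samples, then draw the shortfall with replacement.
        extra = rng.choice(members, size=target - len(members), replace=True)
        idx.extend(members)
        idx.extend(extra)
    idx = np.asarray(idx)
    return X[idx], y[idx]

def threshold_by_priors(probs, priors):
    """Compensate for prior class probabilities: divide each predicted
    posterior by its class prior, then take the argmax. A rare class
    therefore needs a proportionally smaller posterior to be chosen."""
    adjusted = probs / np.asarray(priors)
    return adjusted.argmax(axis=1)
```

For instance, with priors of 0.9 and 0.1, a prediction of (0.6, 0.4) is reweighted to (0.67, 4.0), so the minority class wins despite the lower raw posterior.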

Place, publisher, year, edition, pages
Pergamon-Elsevier Science Ltd, 2018. Vol. 106, p. 249-259.
Keywords [en]
Class imbalance, Convolutional neural networks, Deep learning, Image classification
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:kth:diva-235561
DOI: 10.1016/j.neunet.2018.07.011
ISI: 000445015200021
PubMedID: 30092410
Scopus ID: 2-s2.0-85050996431
OAI: oai:DiVA.org:kth-235561
DiVA, id: diva2:1252286
Note

QC 20181001

Available from: 2018-10-01. Created: 2018-10-01. Last updated: 2018-10-01. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text, PubMed, Scopus

Authority records

Maki, Atsuto
