A systematic study of the class imbalance problem in convolutional neural networks
Duke Univ, Dept Radiol, Sch Med, Durham, NC 27710 USA; KTH Royal Inst Technol, Sch Elect Engn & Comp Sci, Stockholm, Sweden.
KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL. ORCID iD: 0000-0002-4266-6746
Duke Univ, Dept Radiol, Sch Med, Durham, NC 27710 USA; Duke Univ, Dept Elect & Comp Engn, Durham, NC USA.
2018 (English) In: Neural Networks, ISSN 0893-6080, E-ISSN 1879-2782, Vol. 106, pp. 249-259. Article in journal (Refereed) Published
Abstract [en]

In this study, we systematically investigate the impact of class imbalance on classification performance of convolutional neural networks (CNNs) and compare frequently used methods to address the issue. Class imbalance is a common problem that has been comprehensively studied in classical machine learning, yet very limited systematic research is available in the context of deep learning. In our study, we use three benchmark datasets of increasing complexity, MNIST, CIFAR-10 and ImageNet, to investigate the effects of imbalance on classification and perform an extensive comparison of several methods to address the issue: oversampling, undersampling, two-phase training, and thresholding that compensates for prior class probabilities. Our main evaluation metric is area under the receiver operating characteristic curve (ROC AUC) adjusted to multi-class tasks, since the overall accuracy metric presents notable difficulties in the context of imbalanced data. Based on the results of our experiments, we conclude that (i) the effect of class imbalance on classification performance is detrimental; (ii) the method of addressing class imbalance that emerged as dominant in almost all analyzed scenarios was oversampling; (iii) oversampling should be applied to the level that completely eliminates the imbalance, whereas the optimal undersampling ratio depends on the extent of imbalance; (iv) as opposed to some classical machine learning models, oversampling does not cause overfitting of CNNs; (v) thresholding should be applied to compensate for prior class probabilities when the overall number of properly classified cases is of interest.
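The abstract's dominant remedy, oversampling applied "to the level that completely eliminates the imbalance" (conclusion iii), can be sketched in a few lines. This is a minimal illustration using plain random duplication of minority-class samples; the function name and the use of NumPy arrays are assumptions for the example, not the paper's own implementation.

```python
import numpy as np

def oversample_to_balance(X, y, rng=None):
    """Randomly duplicate minority-class samples until every class
    reaches the size of the largest class, i.e. full balance."""
    rng = np.random.default_rng(rng)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    idx = []
    for c in classes:
        members = np.flatnonzero(y == c)
        # keep every original sample, then draw the shortfall with replacement
        extra = rng.choice(members, size=target - len(members), replace=True)
        idx.append(np.concatenate([members, extra]))
    idx = np.concatenate(idx)
    rng.shuffle(idx)
    return X[idx], y[idx]
```

Because only duplicated indices are drawn, every oversampled example is an exact copy of an original one; per conclusion (iv), this repetition does not overfit CNNs the way it can overfit some classical models.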
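Conclusion (v) recommends thresholding that compensates for prior class probabilities. One standard way to do this, shown here as a hedged sketch (the function name is illustrative and the exact correction used in the paper may differ), is to divide the network's posterior outputs by the training-set class priors before taking the argmax:

```python
import numpy as np

def threshold_by_priors(probs, train_priors):
    """Divide each posterior p(class | x) by the training-set prior
    p(class), then renormalize, so that decisions are no longer
    biased toward the majority classes."""
    adjusted = np.asarray(probs, dtype=float) / np.asarray(train_priors, dtype=float)
    return adjusted / adjusted.sum(axis=1, keepdims=True)
```

For example, with posteriors [0.6, 0.4] and priors [0.9, 0.1], the adjusted scores become [1/7, 6/7], flipping the decision from the majority class to the minority class.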
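The evaluation metric is ROC AUC "adjusted to multi-class tasks". A common such adjustment, macro-averaged one-vs-rest AUC, is sketched below via the rank-sum (Mann-Whitney) identity; this is an assumption about the adaptation used, and the function names are illustrative:

```python
import numpy as np

def binary_auc(scores, labels):
    """ROC AUC via the rank-sum identity; tied scores get average ranks."""
    ranks = np.empty(len(scores), dtype=float)
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    for s in np.unique(scores):
        tied = scores == s
        ranks[tied] = ranks[tied].mean()
    n_pos = int(labels.sum())
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def multiclass_auc(probs, y, n_classes):
    """Macro-average the one-vs-rest AUC over all classes."""
    return float(np.mean([binary_auc(probs[:, c], (y == c).astype(int))
                          for c in range(n_classes)]))
```

Unlike overall accuracy, this metric is insensitive to the class distribution of the test set, which is why the authors prefer it for imbalanced data.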

Place, publisher, year, edition, pages
PERGAMON-ELSEVIER SCIENCE LTD, 2018. Vol. 106, pp. 249-259
Keywords [en]
Class imbalance, Convolutional neural networks, Deep learning, Image classification
National subject category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:kth:diva-235561
DOI: 10.1016/j.neunet.2018.07.011
ISI: 000445015200021
PubMedID: 30092410
Scopus ID: 2-s2.0-85050996431
OAI: oai:DiVA.org:kth-235561
DiVA, id: diva2:1252286
Note

QC 20181001

Available from: 2018-10-01 Created: 2018-10-01 Last updated: 2018-10-01 Bibliographically approved

Open Access in DiVA

Full text not available in DiVA

Other links

Publisher's full text · PubMed · Scopus

Person records

Maki, Atsuto
