A Comparison of CNN and Transformer in Continual Learning
KTH, School of Electrical Engineering and Computer Science (EECS).
2023 (English). Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Alternative title
En jämförelse mellan CNN och Transformer för kontinuerlig Inlärning (Swedish)
Abstract [en]

Within computer vision, Convolutional Neural Networks (CNNs) and Transformers are the two predominant architectures, and they are frequently compared with respect to their strengths and weaknesses. This thesis examines the two models in the setting of continual learning, focusing on their ability to resist catastrophic forgetting. We hypothesize that Transformer models are more resilient to catastrophic forgetting than their CNN counterparts. To test this hypothesis, we designed experiments covering a range of models and continual learning approaches, and tuned the networks carefully to ensure a fair comparison. In the majority of the experiments, in both class-incremental and task-incremental learning settings, the results support the hypothesis. They also underscore, however, the need for more exhaustive experimental evaluation before the hypothesis can be considered fully validated.
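
The record does not include the thesis code, but the evaluation protocol the abstract describes can be sketched compactly. Below is a minimal, illustrative example, assuming PyTorch and torchvision; the CIFAR-10 dataset, the five-task class split, the resnet18/vit_b_16 model pair, and all hyperparameters are assumptions for illustration, not the thesis's actual setup. It trains each model sequentially on disjoint class groups with no replay and re-tests all earlier tasks after each stage, so the accuracy drop on earlier tasks exposes catastrophic forgetting:

# Minimal sketch (not the thesis code): measuring catastrophic forgetting
# when fine-tuning sequentially on disjoint class groups, comparing a CNN
# and a Transformer. All choices below are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, transforms, models

def make_task_loaders(num_tasks=5, classes_per_task=2):
    """Split CIFAR-10 into disjoint class groups, one group per task."""
    tfm = transforms.Compose([transforms.Resize(224), transforms.ToTensor()])
    train = datasets.CIFAR10("data", train=True, download=True, transform=tfm)
    test = datasets.CIFAR10("data", train=False, download=True, transform=tfm)
    loaders = []
    for t in range(num_tasks):
        cls = range(t * classes_per_task, (t + 1) * classes_per_task)
        tr_idx = [i for i, y in enumerate(train.targets) if y in cls]
        te_idx = [i for i, y in enumerate(test.targets) if y in cls]
        loaders.append((DataLoader(Subset(train, tr_idx), batch_size=64, shuffle=True),
                        DataLoader(Subset(test, te_idx), batch_size=64)))
    return loaders

@torch.no_grad()
def accuracy(model, loader, device):
    model.eval()
    correct = total = 0
    for x, y in loader:
        pred = model(x.to(device)).argmax(dim=1)
        correct += (pred == y.to(device)).sum().item()
        total += y.numel()
    return correct / total

def run_sequential(model, loaders, device, epochs=1):
    """Plain sequential fine-tuning (no replay, no regularization).
    After each task, report accuracy on every task seen so far: the
    drop on earlier tasks is the forgetting to compare across models."""
    model.to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    for t, (train_loader, _) in enumerate(loaders):
        model.train()
        for _ in range(epochs):
            for x, y in train_loader:
                opt.zero_grad()
                loss_fn(model(x.to(device)), y.to(device)).backward()
                opt.step()
        accs = [accuracy(model, te, device) for _, te in loaders[: t + 1]]
        print(f"after task {t}: " + " ".join(f"{a:.3f}" for a in accs))

if __name__ == "__main__":
    device = "cuda" if torch.cuda.is_available() else "cpu"
    loaders = make_task_loaders()
    # Single 10-way head from the start: a simplification of true
    # class-incremental setups, but enough to expose forgetting.
    for name, net in [("resnet18", models.resnet18(num_classes=10)),
                      ("vit_b_16", models.vit_b_16(num_classes=10))]:
        print(f"== {name} ==")
        run_sequential(net, loaders, device)

Comparing the per-task accuracies printed after the final task for the two architectures gives a forgetting measure in the spirit of the abstract; continual learning methods such as replay or regularization would slot into run_sequential.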

Place, publisher, year, edition, pages
2023, p. 48
Series
TRITA-EECS-EX ; 2023:793
Keywords [en]
Convolutional Neural Network, Transformer, Continual Learning, Image Classification
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-340947
OAI: oai:DiVA.org:kth-340947
DiVA, id: diva2:1820229
Available from: 2024-01-03. Created: 2023-12-17. Last updated: 2024-01-03. Bibliographically approved.

Open Access in DiVA

fulltext (2466 kB), 2121 downloads
File information
File name: FULLTEXT01.pdf
File size: 2466 kB
Checksum (SHA-512):
81c30bc708d262c2248dcb082c2f60db229047f3a43acce4f6f2f0c331c4c4579501a7acb217789e6f214985b997d1e3d11000c4cc8908b82dcc48b43cebe880
Type: fulltext
Mimetype: application/pdf

Total: 2123 downloads
The number of downloads is the sum of all downloads of full texts. It may include, for example, previous versions that are no longer available.

Total: 844 hits