Time efficiency and mistake rates for online learning algorithms: A comparison between Online Gradient Descent and Second Order Perceptron algorithm and their performance on two different data sets
KTH, School of Electrical Engineering and Computer Science (EECS).
2019 (English). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
Alternative title
Tidseffektivitet och felfrekvenser hos online learning algoritmer : En jämförelse mellan Online Gradient Descent och Second Order Perceptron algoritmen och deras presterande på två olika dataset (Swedish)
Abstract [en]

This dissertation investigates the differences between two online learning algorithms, Online Gradient Descent (OGD) and the Second-Order Perceptron (SOP) algorithm, and how well they perform on different data sets in terms of mistake rate, time cost, and number of updates. Studying different online learning algorithms and how they perform in different environments helps in understanding and developing new strategies for handling further online learning tasks. The study includes two data sets, Pima Indians Diabetes and Mushroom, together with the LIBOL library for testing. The results in this dissertation show that Online Gradient Descent performs better overall on the tested data sets. On the first data set, Online Gradient Descent recorded a notably lower mistake rate. On the second data set, although it recorded a slightly higher mistake rate, the algorithm was remarkably more time-efficient than the Second-Order Perceptron. Future work would include a wider range of testing with more, and different, data sets as well as other related algorithms, which would lead to better results and higher credibility.
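The two update rules being compared can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the thesis code: the thesis used the LIBOL library on the Pima Indians Diabetes and Mushroom data sets, whereas the synthetic stream, the learning rate `eta`, and the regularizer `a` below are assumptions for demonstration. The sketch counts mistakes and updates the same way the abstract describes them.

```python
import numpy as np

def run_ogd(X, y, eta=0.1):
    """Online Gradient Descent with hinge loss: count mistakes and updates."""
    w = np.zeros(X.shape[1])
    mistakes = updates = 0
    for x, label in zip(X, y):
        margin = label * (w @ x)
        if margin <= 0:            # wrong (or zero-margin) prediction: a mistake
            mistakes += 1
        if margin < 1:             # hinge loss is active: take a gradient step
            w += eta * label * x
            updates += 1
    return mistakes, updates

def run_sop(X, y, a=1.0):
    """Second-Order Perceptron: a 'whitened' perceptron that only updates on mistakes."""
    d = X.shape[1]
    A = a * np.eye(d)              # a*I plus outer products from mistake rounds
    v = np.zeros(d)                # sum of label * x over mistake rounds
    mistakes = 0
    for x, label in zip(X, y):
        # Predict with the current example's outer product folded into A.
        margin = label * (v @ np.linalg.solve(A + np.outer(x, x), x))
        if margin <= 0:            # mistake: store the example in v and A
            mistakes += 1
            v += label * x
            A += np.outer(x, x)
    return mistakes

# Hypothetical linearly separable stream standing in for the thesis data sets.
rng = np.random.default_rng(0)
n, d = 500, 5
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = np.where(X @ w_true > 0, 1, -1)

ogd_mistakes, ogd_updates = run_ogd(X, y)
sop_mistakes = run_sop(X, y)
print(f"OGD: {ogd_mistakes}/{n} mistakes, {ogd_updates} updates")
print(f"SOP: {sop_mistakes}/{n} mistakes")
```

The sketch also shows why the time-cost gap reported in the abstract is plausible: each OGD step is a single dot product and vector addition, while each SOP prediction solves a d-by-d linear system that grows with the accumulated second-order information.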

Abstract [sv]

This thesis examines the difference between two online learning algorithms, Online Gradient Descent and Second-Order Perceptron, and how they perform on different data sets with a focus on misclassification rate, time efficiency, and number of updates. Studying different online learning algorithms and how they work in different environments helps in understanding and developing new strategies for handling further online learning problems. The study includes two data sets, Pima Indians Diabetes and Mushroom, and uses the LIBOL library for testing. The results of this thesis show that Online Gradient Descent performs better overall on the tested data sets. On the first data set, Online Gradient Descent showed a considerably lower misclassification rate. On the second data set, OGD showed a slightly higher misclassification rate, but at the same time the algorithm was remarkably more time-efficient than the Second-Order Perceptron. Future studies would include broader testing with more, and different, data sets and other related algorithms, which would lead to better results and raise the credibility.

Place, publisher, year, edition, pages
2019, p. 27
Series
TRITA-EECS-EX ; 2019:369
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:kth:diva-260087
OAI: oai:DiVA.org:kth-260087
DiVA id: diva2:1354558
Available from: 2019-10-09. Created: 2019-09-25. Last updated: 2019-10-09. Bibliographically approved.

Open Access in DiVA

fulltext (1200 kB, 3 downloads)
File name: FULLTEXT01.pdf. File size: 1200 kB.
Checksum (SHA-512): 1336bd3713d7550a3379ff8f1620e12bcd55951d1ea8e9a17d1bf08f068fadaae1275d87a2e8ed20415fcea70d361e5a73c00ef482fba7463317abc7eab3ad05
Type: fulltext. Mimetype: application/pdf.


