System for comparing topic suggestion algorithms using multiple evaluation properties
KTH, School of Information and Communication Technology (ICT).
2014 (English). Independent thesis, Advanced level (degree of Master, Two Years), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Recommender systems are core components for many companies, which continually improve them in order to maximize user satisfaction with their products. The only evaluation these companies usually perform is an online experiment on live data, which leaves them without offline tools and postpones quality assessment until the algorithm is already deployed and used in production.

In this work, we describe a software tool for offline evaluation of a selected recommender algorithm. We discuss the properties that are important for assessing the chosen algorithm, present the user behavior model that best reflects expected real-life usage, and compare various data sets (publicly available on the Internet and provided by the company) suitable for offline evaluation. We introduce an extensible software tool for offline assessment that is integrated into the test environment created and maintained by Salesforce.com.

The tool aims to be flexible, allowing data sets, user behavior models and metrics to be easily swapped or reused for the evaluation of other recommender algorithms. We also give a set of recommendations on how the selected algorithm could be improved, supporting these enhancement suggestions with an evaluation performed using the implemented tool.
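
The abstract does not spell out the tool's interfaces, so the sketch below is only a rough illustration of how such a pluggable offline evaluation loop could look; all names (Interaction, evaluate_offline, precision_at_k) are hypothetical and do not come from the thesis or from Salesforce.com's test environment.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Iterable, List, Sequence


@dataclass
class Interaction:
    """A single observed user action, e.g. accepting a suggested topic (hypothetical)."""
    user_id: str
    item_id: str


# A recommender takes a user id and returns a ranked list of item ids.
Recommender = Callable[[str], List[str]]


def precision_at_k(recommended: Sequence[str], relevant: Iterable[str], k: int = 10) -> float:
    """Fraction of the top-k recommendations the user actually interacted with."""
    relevant_set = set(relevant)
    top_k = recommended[:k]
    if not top_k:
        return 0.0
    return sum(1 for item in top_k if item in relevant_set) / len(top_k)


def evaluate_offline(
    recommender: Recommender,
    held_out: Iterable[Interaction],
    metrics: Dict[str, Callable[[Sequence[str], Iterable[str]], float]],
) -> Dict[str, float]:
    """Replay held-out interactions against the recommender and average each metric.

    The data set, the simulated user behavior (encoded in the held-out log) and the
    metrics are all plain parameters, so any of them can be swapped without touching
    this loop.
    """
    per_user: Dict[str, List[str]] = {}
    for event in held_out:
        per_user.setdefault(event.user_id, []).append(event.item_id)

    totals = {name: 0.0 for name in metrics}
    for user_id, relevant_items in per_user.items():
        recommended = recommender(user_id)
        for name, metric in metrics.items():
            totals[name] += metric(recommended, relevant_items)

    n_users = max(len(per_user), 1)
    return {name: value / n_users for name, value in totals.items()}


if __name__ == "__main__":
    # Toy recommender and interaction log, purely for illustration.
    toy_recommender = lambda user_id: ["topic-a", "topic-b", "topic-c"]
    toy_log = [Interaction("u1", "topic-b"), Interaction("u2", "topic-x")]
    print(evaluate_offline(toy_recommender, toy_log, {"precision@10": precision_at_k}))
```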

Place, publisher, year, edition, pages
2014. 73 p.
Series
TRITA-ICT-EX, 2014:207
National Category
Computer and Information Science
Identifiers
URN: urn:nbn:se:kth:diva-188169
OAI: oai:DiVA.org:kth-188169
DiVA: diva2:934052
Educational program
Master of Science - Distributed Computing
Examiners
Available from: 2016-06-08. Created: 2016-06-08. Last updated: 2016-06-08. Bibliographically approved.

Open Access in DiVA

fulltext (1920 kB), 59 downloads
File information
File name: FULLTEXT01.pdf
File size: 1920 kB
Checksum (SHA-512):
f996e66e28de14c37f97c43c236ab7771e6df37025e92efd8114b2ec0886ee897e9c2081e6f00e7466883732098a9659fdd7c13733f2d600e7ff71dbda91e363
Type: fulltext
Mimetype: application/pdf
