User Centred Evaluation in Experimental and Practical Settings
KTH, School of Computer Science and Communication (CSC), Media technology and interaction design.
2012 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The objective of this thesis is to obtain knowledge about how effective user centred evaluation methods are and how user centred evaluations are conducted by IT professionals. This is achieved by exploring user centred evaluation in both experimental and practical settings. The knowledge gained from these studies should inform suggestions both for further research and for improvements to the user centred evaluation activity.

Two experimental studies were conducted. One compares the results of three user centred evaluation methods; the other examines two factors in conducting heuristic evaluation. The results show that, of the three methods, think-aloud evaluation was the most effective at finding realistic usability problems. The number of critical problems found during think-aloud evaluation increases if heuristic evaluation is conducted beforehand.

Further, two studies of user centred evaluation in practical settings were performed. The IT professionals participating in these studies used the Scrum software development process to plan their work. The results show that user centred evaluation is conducted infrequently in Scrum projects compared to testing activities such as acceptance testing. The evaluation that is done is mainly qualitative: few participants measure user performance or use surveys to gather quantitative results on usability and user experience. IT professionals gather feedback from users informally, and they also gather informal feedback from peers. Many participants use a mixture of methods for gathering feedback on their work.

The outcome of this thesis shows that IT professionals should be encouraged to include users whenever possible when evaluating software, for example by using the think-aloud method. Conducting heuristic evaluation before think-aloud evaluations is also recommended. In addition, IT professionals are encouraged to evaluate their software informally and frequently, rather than waiting for the right time to conduct a thorough quantitative evaluation.

To advance the field further, researchers who want to improve the evaluation activity for IT professionals should study how user centred evaluation methods can be combined efficiently and how the use of qualitative evaluation methods can be made more effective.

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2012. xii, 80 p.
Series
Trita-CSC-A, ISSN 1653-5723; 2012:05
Keyword [en]
User centred evaluation, Scrum, evaluation methods, agile software development
National Category
Human Computer Interaction
Identifiers
URN: urn:nbn:se:kth:diva-95302
ISBN: 978-91-7501-357-2 (print)
OAI: oai:DiVA.org:kth-95302
DiVA: diva2:527518
Public defence
2012-06-08, F3, Lindstedtsvägen 26, KTH, Stockholm, 13:15 (English)
Note
QC 20120522. Available from: 2012-05-22. Created: 2012-05-21. Last updated: 2012-05-22. Bibliographically approved.
List of papers
1. Prediction of usability: Comparing method combinations
1999 (English). Conference paper, Published paper (Refereed)
Keyword
Usability evaluation methods, heuristic evaluation, cognitive walkthrough, think-aloud method
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-95296 (URN)
Conference
Managing Information Technology Resources in Organizations in the Next Millennium, Hershey, USA
Note
QC 20120522. Available from: 2012-05-21. Created: 2012-05-21. Last updated: 2012-05-22. Bibliographically approved.
2. Heuristic evaluation: Comparing ways of finding and reporting usability problems
2007 (English). In: Interacting with Computers, ISSN 0953-5438, E-ISSN 1873-7951, Vol. 19, no. 2, pp. 225-240. Article in journal (Refereed), Published.
Abstract [en]

Research on heuristic evaluation in recent years has focused on improving its effectiveness and efficiency with respect to user testing. The aim of this paper is to refine a research agenda for comparing and contrasting evaluation methods. To reach this goal, a framework is presented for evaluating the effectiveness of different types of support for structured usability problem reporting. This paper reports on an empirical study of this framework that compares two sets of heuristics, Nielsen's heuristics and the cognitive principles of Gerhardt-Powals, and two media for reporting a usability problem, i.e. either a web tool or paper. The study found no significant differences between any of the four groups in effectiveness, efficiency and inter-evaluator reliability. A more significant contribution of this research is that the framework used for the experiments proved successful and, because of its thorough structure, should be reusable by other researchers.
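To make the idea of structured usability problem reporting concrete, the following is a minimal Python sketch, not taken from the paper: the schema fields, the severity scale and the known-problem list are illustrative assumptions. It shows one possible report record and the kind of effectiveness measure, known problems found relative to known problems overall, that such a framework could compare across evaluator groups.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProblemReport:
    """One structured usability problem report (illustrative schema)."""
    evaluator: str    # who reported the problem
    heuristic: str    # e.g. a Nielsen heuristic or a Gerhardt-Powals principle
    description: str  # free-text description of the problem
    severity: int     # e.g. 1 (cosmetic) .. 4 (catastrophic)
    medium: str       # "web tool" or "paper"
    matches: str      # id of the known problem this report was matched to

def effectiveness(reports: list[ProblemReport], known: set[str]) -> float:
    """Share of known usability problems found by at least one report."""
    found = {r.matches for r in reports} & known
    return len(found) / len(known) if known else 0.0

# Illustrative data: three known problems, two of them reported.
known_problems = {"unclear-labels", "no-undo", "slow-feedback"}
reports = [
    ProblemReport("E1", "Visibility of system status",
                  "No progress bar", 3, "web tool", "slow-feedback"),
    ProblemReport("E2", "User control and freedom",
                  "Cannot undo delete", 4, "paper", "no-undo"),
]
print(f"effectiveness = {effectiveness(reports, known_problems):.2f}")  # 0.67
```

On the same assumptions, efficiency could be computed analogously as problems found per unit of evaluation time, and inter-evaluator reliability from the overlap between the sets of problems matched by different evaluators.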

Keyword
user interface, heuristic evaluation, reporting, web tool, effectiveness, efficiency, comparison framework
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-95312 (URN)
10.1016/j.intcom.2006.10.001 (DOI)
000244575900009 (ISI)
Note

QC 20120522

Available from: 2012-05-22. Created: 2012-05-22. Last updated: 2017-12-07. Bibliographically approved.
3. The focus on usability in testing practices in industry
2010 (English). Conference paper, Published paper (Refereed)
Keyword
usability, software testing, agile development, Scrum, practitioners
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-95300 (URN)
2-s2.0-84943635442 (Scopus ID)
Conference
Human-Computer Interaction Symposium at the World Computer Congress, Brisbane, Australia.
Note

QC 20120521

Available from: 2012-05-21. Created: 2012-05-21. Last updated: 2016-03-14. Bibliographically approved.
4. Informal feedback rather than performance measurements: user-centred evaluation in Scrum projects
2014 (English). In: Behaviour & Information Technology, ISSN 0144-929X, E-ISSN 1362-3001, Vol. 33, no. 11, pp. 1118-1135. Article in journal (Refereed), Published.
Abstract [en]

The main features of the Scrum process are intense communication between different stakeholders and rapid feedback based on regular delivery of working software. Integrating traditional user-centred evaluation activities in this context is challenging. Hence, this paper presents an interview study of 21 informants, categorised into four different professional roles. The main contribution of the paper is an overview of the types of user-centred evaluation conducted by information technology professionals in various Scrum projects. Results show that various forms of feedback are indeed gathered on the usability and user experience of the software, system or service being developed. However, the user-centred evaluations conducted are typically informal, involve few users, gather empirical qualitative data and are performed during short, unplanned sessions. Performance measurements gathering quantitative data are seldom used. Informants in the business specialist role merely ask users for their opinion, whereas the other roles use several user-centred evaluation activities to gather feedback on their design. Generally, feedback is gathered throughout the whole project, but evaluation is often conducted early in the project or even before actual development starts. Finally, these results are discussed in relation to previous studies in the area.
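For contrast with the informal feedback the paper describes, a performance measurement in the sense used here gathers quantitative data such as task success and completion time. Below is a minimal Python sketch; the session data and the choice of metrics are invented for illustration and are not taken from the study.

```python
from statistics import mean

# Illustrative task sessions: (user, task completed?, time in seconds)
sessions = [
    ("u1", True, 74.2),
    ("u2", True, 91.0),
    ("u3", False, 120.0),  # gave up or timed out
    ("u4", True, 66.5),
]

# Two common quantitative usability measures.
success_rate = sum(ok for _, ok, _ in sessions) / len(sessions)
mean_time = mean(t for _, ok, t in sessions if ok)  # successful tasks only

print(f"task success rate: {success_rate:.0%}")                # 75%
print(f"mean completion time (successes): {mean_time:.1f} s")  # 77.2 s
```

Collecting data like this requires planned sessions with defined tasks, which is exactly what the interviewed Scrum practitioners rarely set up; the short, unplanned sessions they favour yield qualitative observations instead.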

Keyword
user-centred evaluation, usability, user experience, Scrum, human-centred activities, agile development
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-95301 (URN)
10.1080/0144929X.2013.857430 (DOI)
000342237100002 (ISI)
2-s2.0-84907249124 (Scopus ID)
Note

QC 20141027. Updated from submitted to published.

Available from: 2012-05-21. Created: 2012-05-21. Last updated: 2017-12-07. Bibliographically approved.

Open Access in DiVA

Thesis-Marta (743 kB)
File name: FULLTEXT01.pdf
File size: 743 kB
Checksum (SHA-512): 31c5c916e55b0c02ec067053f48a6af35917ed7e3d6b49b68a56dfe6edbed8d6826282801c7e800442f8628172f0c71d0a332ea4621825360a9c6167146c49b2
Type: fulltext
Mimetype: application/pdf

Author/editor
Larusdottir, Marta K.
Organisation
Media technology and interaction design
