Sonification of the self vs. sonification of the other: Differences in the sonification of performed vs. observed simple hand movements
KTH, School of Electrical Engineering and Computer Science (EECS), Human Centered Technology, Media Technology and Interaction Design, MID (Sound and Music Computing). ORCID iD: 0000-0002-3086-0322
University College Cork, National University of Ireland, Cork, IE. ORCID iD: 0000-0002-9933-8583
KTH, School of Electrical Engineering and Computer Science (EECS), Human Centered Technology, Media Technology and Interaction Design, MID (Sound and Music Computing). ORCID iD: 0000-0002-2659-0411
KTH, School of Electrical Engineering and Computer Science (EECS), Human Centered Technology, Media Technology and Interaction Design, MID (Sound and Music Computing). ORCID iD: 0000-0002-4422-5223
2020 (English). In: International Journal of Human-Computer Studies, ISSN 1071-5819, E-ISSN 1095-9300, Vol. 144. Article in journal (Refereed). Published.
Abstract [en]

Existing work on interactive sonification of movements, i.e., the translation of human movement qualities from the physical to the auditory domain, usually adopts a predetermined approach: the way in which movement features modulate the characteristics of sound is fixed. In our work we go one step further and demonstrate that the user's role can influence the tuning of the mapping between movement cues and sound parameters. Here, we aim to verify whether and how the mapping changes when the user is either the performer or the observer of a series of body movements (tracing a square or an infinity shape with the hand in the air). We asked participants to tune movement sonification either while directly performing the sonified movement or while watching another person perform the movement and listening to its sonification. Results show that the tuning of the sonification chosen by participants is influenced by three variables: the role of the user (performer vs. observer), movement quality (the amount of Smoothness and Directness in the movement), and physical parameters of the movements (velocity and acceleration). Performers focused more on the quality of their movement, while observers focused more on the sonic rendering, making it more expressive and more connected to low-level physical features.

Place, publisher, year, edition, pages
Elsevier BV, 2020. Vol. 144
Keywords [en]
Sonification, Mapping, Hand movement, Performance, User role
National Category
Other Engineering and Technologies; Computer and Information Sciences; Other Computer and Information Science
Research subject
Media Technology; Human-computer Interaction
Identifiers
URN: urn:nbn:se:kth:diva-277059
DOI: 10.1016/j.ijhcs.2020.102500
ISI: 000573482500002
Scopus ID: 2-s2.0-85086799093
OAI: oai:DiVA.org:kth-277059
DiVA id: diva2:1446247
Projects
DANCE
Funder
EU, Horizon 2020, 645553
Note

QC 20200819

Available from: 2020-06-24. Created: 2020-06-24. Last updated: 2025-02-18. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text: http://www.sciencedirect.com/science/article/pii/S1071581920301026
Scopus

Authority records

Bresin, Roberto; Elblaus, Ludvig; Frid, Emma

Search in DiVA

By author/editor
Bresin, Roberto; Mancini, Maurizio; Elblaus, Ludvig; Frid, Emma
By organisation
Media Technology and Interaction Design, MID
In the same journal
International Journal of Human-Computer Studies
Other Engineering and Technologies; Computer and Information Sciences; Other Computer and Information Science
