Interactive Sonification of Spontaneous Movement of Children: Cross-Modal Mapping and the Perception of Body Movement Qualities through Sound
Frid, Emma. KTH, School of Computer Science and Communication (CSC), Media Technology and Interaction Design, MID (Sound and Music Computing). ORCID iD: 0000-0002-4422-5223
Bresin, Roberto. KTH, School of Computer Science and Communication (CSC), Media Technology and Interaction Design, MID (Sound and Music Computing). ORCID iD: 0000-0002-3086-0322
Elblaus, Ludvig. KTH, School of Computer Science and Communication (CSC), Media Technology and Interaction Design, MID (Sound and Music Computing). ORCID iD: 0000-0002-2659-0411
2016 (English). In: Frontiers in Neuroscience, ISSN 1662-4548, E-ISSN 1662-453X, Vol. 10, article 521. Article in journal (Refereed). Published.
Abstract [en]

In this paper we present three studies focusing on the effect of different sound models in interactive sonification of bodily movement. We hypothesized that a sound model characterized by continuous smooth sounds would be associated with different movement characteristics than a model characterized by abrupt variations in amplitude, and that these associations would be reflected in spontaneous movement characteristics. Three subsequent studies were conducted to investigate the relationship between properties of bodily movement and sound: (1) a motion capture experiment involving interactive sonification of a group of children spontaneously moving in a room, (2) an experiment involving perceptual ratings of sonified movement data, and (3) an experiment involving matching between sonified movements and their visualizations in the form of abstract drawings. In (1) we used a system consisting of 17 IR cameras tracking passive reflective markers. The head positions in the horizontal plane of 3-4 children were simultaneously tracked and sonified, producing 3-4 sound sources spatially displayed through an 8-channel loudspeaker system. We analyzed the children's spontaneous movement in terms of energy, smoothness, and directness indices. Despite large inter-participant variability and group-specific effects caused by interaction among children engaging in the spontaneous movement task, we found a small but significant effect of sound model. Results from (2) indicate that different sound models can be rated differently on a set of motion-related perceptual scales (e.g., expressivity and fluidity). The results also imply that audio-only stimuli can evoke stronger perceived properties of movement (e.g., energetic, impulsive) than stimuli involving both audio and video representations. Findings in (3) suggest that sounds portraying bodily movement can be represented by abstract drawings in a meaningful way. We argue that the results from these studies support the existence of a cross-modal mapping of body motion qualities between bodily movement and sound: motion qualities can be translated into and understood from sound, conveyed through sound visualizations in the form of drawings, and translated back from such visualizations to audio. The work underlines the potential of using interactive sonification to communicate high-level features of human movement data.
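
The movement descriptors named in the abstract (energy, smoothness, and directness indices) can be illustrated with a short sketch. The snippet below is not taken from the paper; it assumes common proxies for each descriptor (mean squared speed for energy, mean jerk magnitude for smoothness, and the displacement-to-path-length ratio for directness), and the function name, array layout, and sampling rate are hypothetical.

```python
import numpy as np

def movement_indices(pos, fs):
    """Movement descriptors for a horizontal-plane head trajectory.

    pos: (N, 2) array of positions in meters; fs: sampling rate in Hz.
    Formulas are common proxies, assumed here for illustration only.
    """
    dt = 1.0 / fs
    vel = np.gradient(pos, dt, axis=0)    # velocity, m/s
    acc = np.gradient(vel, dt, axis=0)    # acceleration, m/s^2
    jerk = np.gradient(acc, dt, axis=0)   # jerk, m/s^3

    speed = np.linalg.norm(vel, axis=1)

    # Energy index: mean squared speed, a kinetic-energy proxy.
    energy = float(np.mean(speed ** 2))

    # Smoothness index: negated mean jerk magnitude, so smoother
    # (less jerky) motion scores higher.
    smoothness = -float(np.mean(np.linalg.norm(jerk, axis=1)))

    # Directness index: straight-line displacement divided by the
    # traveled path length; 1.0 means a perfectly direct path.
    path = np.sum(np.linalg.norm(np.diff(pos, axis=0), axis=1))
    direct = np.linalg.norm(pos[-1] - pos[0])
    directness = float(direct / path) if path > 0 else 0.0

    return energy, smoothness, directness
```

Applied to each child's tracked head trajectory, e.g., movement_indices(traj, fs=100.0) for 100 Hz motion-capture data, this yields one scalar per descriptor that could then be compared across sound-model conditions.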

Place, publisher, year, edition, pages
Frontiers Media S.A., 2016. Vol. 10, article 521.
Keyword [en]
interactive sonification, movement analysis, movement sonification, mapping, motion capture, perception
National Category
Media and Communication Technology; Human Computer Interaction; Other Computer and Information Science
Research subject
Media Technology; Human-computer Interaction; Speech and Music Communication
Identifiers
URN: urn:nbn:se:kth:diva-196225
DOI: 10.3389/fnins.2016.00521
ISI: 000387532100001
Scopus ID: 2-s2.0-85009797073
OAI: oai:DiVA.org:kth-196225
DiVA: diva2:1046616
Projects
DANCE
Funder
EU, Horizon 2020, 645553
Note

QC 20161117

Available from: 2016-11-14. Created: 2016-11-14. Last updated: 2017-05-23. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text: http://journal.frontiersin.org/article/10.3389/fnins.2016.00521
Scopus
