Interactive sonification of synchronisation of motoric behaviour in social active listening to music with mobile devices
2012 (English). In: Journal on Multimodal User Interfaces, ISSN 1783-7677, E-ISSN 1783-8738, Vol. 5, no. 3-4, pp. 157-173. Article in journal (refereed), published.
Abstract [en]
This paper evaluates three different interactive sonifications of dyadic coordinated human rhythmic activity. An index of phase synchronisation of gestures was chosen as the coordination metric. The sonifications are implemented as three prototype applications for mobile devices: Sync’n’Moog, Sync’n’Move, and Sync’n’Mood. Sync’n’Moog sonifies the phase synchronisation index by acting directly on the audio signal, applying a nonlinear time-varying filtering technique. Sync’n’Move acts on the multi-track music content by bringing individual instruments forward or fading them out. Sync’n’Mood manipulates the affective features of the music performance. The three sonifications were also tested against a condition without sonification.
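The coordination metric mentioned in the abstract, a phase synchronisation index of two gesture streams, can be illustrated with the mean phase coherence: the magnitude of the average unit phasor of the phase difference, which is 1 for a constant phase lag and near 0 for unrelated phases. This is a minimal sketch, not the paper's exact implementation; the function name and the synthetic phase series are illustrative assumptions.

```python
import cmath
import math

def phase_sync_index(phases_a, phases_b):
    """Mean phase coherence of two phase time series (radians).

    Returns a value in [0, 1]: 1 for perfectly locked phases,
    near 0 for statistically independent phases.
    """
    diffs = [pa - pb for pa, pb in zip(phases_a, phases_b)]
    return abs(sum(cmath.exp(1j * d) for d in diffs) / len(diffs))

# Two gestures at the same tempo with a fixed phase offset are
# perfectly synchronised under this index:
t = [k * 0.01 for k in range(500)]
a = [2 * math.pi * 1.5 * x for x in t]        # 1.5 Hz phase ramp
b = [2 * math.pi * 1.5 * x + 0.7 for x in t]  # same frequency, constant lag
print(round(phase_sync_index(a, b), 3))  # prints 1.0
```

In an interactive setting such as the prototypes described here, this index would be computed over a sliding window of recent gesture phases and mapped to a sonification parameter (filter setting, track mix, or expressive rendering).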
Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2012. Vol. 5, no. 3-4, pp. 157-173.
Keywords [en]
Interactive sonification, Interactive systems, Audio systems, Sound and music computing, Active music listening, Synchronisation
National Category
Computer Science; Human Computer Interaction; Psychology; Media and Communication Technology
Identifiers
URN: urn:nbn:se:kth:diva-52200
DOI: 10.1007/s12193-011-0079-z
ISI: 000309998300008
Scopus ID: 2-s2.0-84861014654
OAI: oai:DiVA.org:kth-52200
DiVA: diva2:465498
Funder
EU, FP7, Seventh Framework Programme, 215749 SAME
Swedish Research Council, 2010-4654
QC 20150623. Available from: 2011-12-14. Created: 2011-12-14. Last updated: 2015-06-23. Bibliographically approved.