Interactive sonification of emotionally expressive gestures by means of music performance
2010 (English). In: Proceedings of ISon 2010, 3rd Interactive Sonification Workshop / [ed] Bresin, Roberto; Hermann, Thomas; Hunt, Andy. Stockholm, Sweden: KTH Royal Institute of Technology, 2010, pp. 113-116. Conference paper (Refereed)
This study presents a procedure for interactive sonification of emotionally expressive hand and arm gestures by affecting a musical performance in real time. Three different mappings are described that translate accelerometer data to a set of parameters that control the expressiveness of the performance by affecting tempo, dynamics, and articulation. The first two mappings, tested with a number of subjects during a public event, are relatively simple and were designed by the authors using a top-down approach. According to user feedback, they were not intuitive and limited the usability of the software. A bottom-up approach was taken for the third mapping: a Classification Tree was trained with features extracted from gesture data from a number of test subjects who were asked to express different emotions with their hand movements. A second set of data, where subjects were asked to make a gesture that corresponded to a piece of expressive music they had just listened to, was used to validate the model. The results were not particularly accurate, but reflected the small differences in the data and the ratings given by the subjects to the different performances they listened to.
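As an illustration of the kind of top-down mapping the abstract describes, the sketch below derives the three expressive parameters (tempo, dynamics, articulation) from raw 3-axis accelerometer samples. The feature choices (mean acceleration magnitude as gesture energy, sample-to-sample magnitude change as jerkiness) and all scaling constants are the editor's assumptions for illustration, not the authors' actual mapping.

```python
import math

def expressive_mapping(samples):
    """Map 3-axis accelerometer samples [(ax, ay, az), ...] (m/s^2) to
    illustrative expressive performance parameters. The scaling constants
    are placeholder guesses, not values from the paper."""
    # Gesture energy: mean acceleration magnitude over the window.
    mags = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in samples]
    energy = sum(mags) / len(mags)
    # Jerkiness: mean absolute change in magnitude between samples.
    jerk = sum(abs(b - a) for a, b in zip(mags, mags[1:])) / max(len(mags) - 1, 1)
    return {
        # More energetic gestures -> faster tempo (multiplier, capped at 2x).
        "tempo_scale": min(2.0, 0.5 + energy / 9.81),
        # More energy -> louder dynamics (0..1).
        "dynamics": min(1.0, energy / 20.0),
        # Jerky gestures -> staccato (short notes), smooth -> legato (long).
        "articulation": max(0.1, 1.0 - jerk / 10.0),
    }
```

A gesture window of, say, 50-100 samples would be fed to such a function on every control cycle, and the resulting parameters applied to the ongoing performance.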
Place, publisher, year, edition, pages
Stockholm, Sweden: KTH Royal Institute of Technology, 2010. 113-116 p.
Keywords: Computer Science; Human Computer Interaction; Music Psychology
Identifiers
URN: urn:nbn:se:kth:diva-52135
OAI: oai:DiVA.org:kth-52135
DiVA: diva2:465430
ISon 2010, 3rd Interactive Sonification Workshop, Stockholm, Sweden, April 7, 2010
QC 20111222. Available from: 2011-12-14. Last updated: 2016-08-22. Bibliographically approved.