Expressive Control of Music and Visual Media by Full-Body Movement
2007 (English). In: Proceedings of the 7th International Conference on New Interfaces for Musical Expression, NIME '07. New York, NY, USA: ACM Press, 2007, pp. 390-391. Conference paper (Refereed)
In this paper we describe a system that allows users to control, in real time, the generation of expressive audio-visual feedback with their full body. The system extracts expressive motion features from the user's full-body movements and gestures. The values of these motion features are mapped both onto acoustic parameters for the real-time expressive rendering of a piece of music, and onto real-time generated visual feedback projected on a screen in front of the user.
Place, publisher, year, edition, pages
New York, NY, USA: ACM Press, 2007. pp. 390-391
Keywords
Expressive interaction, Interactive music systems, Multimodal environments
National Category
Computer Science; Human Computer Interaction; Computer Vision and Robotics (Autonomous Systems); Psychology; Music
Identifiers
URN: urn:nbn:se:kth:diva-52083
DOI: 10.1145/1279740.1279829
ScopusID: 2-s2.0-77953565929
OAI: oai:DiVA.org:kth-52083
DiVA: diva2:465377
Conference
7th International Conference on New Interfaces for Musical Expression, NIME '07; New York, NY; United States; 6 June 2007 through 10 June 2007