Beat synchronous dance animation based on visual analysis of human motion and audio analysis of music tempo
2013 (English). In: Advances in Visual Computing: 9th International Symposium, ISVC 2013, Rethymnon, Crete, Greece, July 29-31, 2013. Proceedings, Part II. Springer Berlin/Heidelberg, 2013, Vol. 8034, pp. 118-127. Conference paper (Refereed)
We present a framework that generates beat-synchronous dance animation based on the analysis of both visual and audio data. First, the articulated motion of a dancer is captured from markerless visual observations obtained by a multicamera system. We propose and employ a new method for the temporal segmentation of such motion data into the periods of the dance. Next, we use a beat tracking algorithm to estimate the pulse related to the tempo of a piece of music. Given input music of the same genre as the visually observed dance, we automatically produce a beat-synchronous dance animation of a virtual character. The proposed approach has been validated with extensive experiments performed on a data set containing a variety of traditional Greek/Cretan dances and the corresponding music.
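The abstract does not specify which beat tracking algorithm is used; a common baseline for estimating the tempo pulse is autocorrelation of an onset-strength envelope. The sketch below is purely illustrative (the function name `estimate_tempo`, the frame rate, and the synthetic envelope are all assumptions, not from the paper): it picks the beat period whose lag maximizes the envelope's autocorrelation within a plausible BPM range.

```python
# Illustrative tempo estimation by autocorrelation of an onset-strength
# signal. This is NOT the paper's method, just a standard baseline sketch.

def estimate_tempo(onset_env, frame_rate, bpm_min=60, bpm_max=180):
    """Return the tempo (BPM) whose beat period (in frames) best matches
    the autocorrelation of the onset-strength envelope."""
    n = len(onset_env)
    lag_min = int(frame_rate * 60.0 / bpm_max)  # shortest candidate beat period
    lag_max = int(frame_rate * 60.0 / bpm_min)  # longest candidate beat period
    best_lag, best_score = None, float("-inf")
    for lag in range(lag_min, min(lag_max, n - 1) + 1):
        # Raw autocorrelation at this lag.
        score = sum(onset_env[i] * onset_env[i - lag] for i in range(lag, n))
        if score > best_score:
            best_score, best_lag = score, lag
    return 60.0 * frame_rate / best_lag

# Synthetic envelope: an onset every 50 frames at 100 frames/s, i.e. 120 BPM.
frame_rate = 100
env = [1.0 if i % 50 == 0 else 0.0 for i in range(1000)]
print(estimate_tempo(env, frame_rate))  # → 120.0
```

In practice the onset envelope would come from spectral-flux analysis of the audio, and the strongest autocorrelation lag gives the beat period used to retime the captured dance motion.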
Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2013. Vol. 8034, pp. 118-127.
Series
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), ISSN 0302-9743 ; 8034
Keywords
Articulated motion, Audio analysis, Human motions, Multicamera systems, Temporal segmentations, Virtual character, Visual analysis, Visual observations
Identifiers
URN: urn:nbn:se:kth:diva-193773
DOI: 10.1007/978-3-642-41939-3_12
ScopusID: 2-s2.0-84888211834
ISBN: 978-3-642-41938-6
OAI: oai:DiVA.org:kth-193773
DiVA: diva2:1040344
9th International Symposium on Advances in Visual Computing, ISVC 2013, Rethymnon, Crete, Greece, 29 July 2013 through 31 July 2013
QC 20161114. Available from: 2016-10-27. Created: 2016-10-10. Last updated: 2016-11-14. Bibliographically approved