Extracting and analyzing head movements accompanying spontaneous dialogue
2013 (English). In: Conference Proceedings TiGeR 2013: Tilburg Gesture Research Meeting, 2013. Conference paper (Refereed).
This paper reports on a method developed for extracting and analyzing head gestures taken from motion capture data of spontaneous dialogue in Swedish. Candidate head gestures with beat function were extracted automatically and then manually classified using a 3D player which displays time-synced audio and 3D point data of the motion capture markers together with animated characters. Prosodic features were extracted from syllables co-occurring with a subset of the classified gestures. The beat gestures show considerable variation in temporal synchronization with the syllables, while the syllables generally show greater intensity, higher F0, and greater F0 range when compared to the mean across the entire dialogue. Additional features for further analysis and automatic classification of the head gestures are discussed.
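The prosodic comparison described above can be sketched as follows. This is an illustrative outline only, not the authors' implementation: the per-syllable measurements, field names, and data layout are all assumed for the example.

```python
# Hypothetical sketch of the prosodic comparison step: given per-syllable
# intensity and F0 measurements, compare syllables co-occurring with beat
# gestures against the mean across the entire dialogue.
# Field names and units are illustrative, not the authors' format.
from statistics import mean

def prosodic_profile(syllables):
    """Mean intensity (dB), mean F0 (Hz), and mean F0 range for syllables."""
    return {
        "intensity": mean(s["intensity"] for s in syllables),
        "f0": mean(s["f0_mean"] for s in syllables),
        "f0_range": mean(s["f0_max"] - s["f0_min"] for s in syllables),
    }

def compare_to_dialogue(gesture_syllables, all_syllables):
    """Difference between gesture-linked syllables and the dialogue mean.

    Positive values correspond to the reported pattern: greater intensity,
    higher F0, and greater F0 range for gesture-linked syllables.
    """
    g = prosodic_profile(gesture_syllables)
    d = prosodic_profile(all_syllables)
    return {k: g[k] - d[k] for k in g}

if __name__ == "__main__":
    # Toy data: two syllables, the second co-occurring with a beat gesture.
    all_syl = [
        {"intensity": 60, "f0_mean": 110, "f0_min": 100, "f0_max": 120},
        {"intensity": 70, "f0_mean": 140, "f0_min": 110, "f0_max": 180},
    ]
    print(compare_to_dialogue([all_syl[1]], all_syl))
```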
Research subject: Computer Science; Language Technology (Computational Linguistics)
Identifiers: URN: urn:nbn:se:kth:diva-137400; OAI: oai:DiVA.org:kth-137400; DiVA: diva2:678907
Conference: TiGeR 2013: Tilburg Gesture Research Meeting: 10th International Gesture Workshop (GW) and 3rd Gesture and Speech in Interaction (GESPIN) Conference, Tilburg University, Netherlands, 2013-06-19 to 2013-06-21
QC 20140604. Bibliographically approved.