Analysis of gaze and speech patterns in three-party quiz game interaction
2013 (English). In: Interspeech 2013, 2013, pp. 1126-1130. Conference paper (Refereed).
In order to understand and model the dynamics between interaction phenomena such as gaze and speech in face-to-face multiparty interaction between humans, we need large quantities of reliable, objective data of such interactions. To date, this type of data is in short supply. We present a data collection setup using automated, objective techniques in which we capture the gaze and speech patterns of triads deeply engaged in a high-stakes quiz game. The resulting corpus consists of five one-hour recordings, and is unique in that it makes use of three state-of-the-art gaze trackers (one per subject) in combination with a state-of-the-art conical microphone array designed to capture roundtable meetings. Several video channels are also included. In this paper we present the obstacles we encountered and the possibilities afforded by a synchronised, reliable combination of large-scale multiparty speech and gaze data, together with an overview of the first analyses of the data.
Index Terms: multimodal corpus, multiparty dialogue, gaze patterns, multiparty gaze
Subject: Computer Science; Language Technology (Computational Linguistics)
Identifiers: URN: urn:nbn:se:kth:diva-137388; Scopus ID: 2-s2.0-84906231582; OAI: oai:DiVA.org:kth-137388; DiVA: diva2:678921
Conference: 14th Annual Conference of the International Speech Communication Association (Interspeech 2013), Lyon, France, August 25-29, 2013. ISCA 2013.