Integration of gestures and speech in human-robot interaction
2012 (English). In: 3rd IEEE International Conference on Cognitive Infocommunications, CogInfoCom 2012 - Proceedings, IEEE, 2012, pp. 673-678. Conference paper (Refereed)
We present an approach to enhance the interaction abilities of the Nao humanoid robot by extending its communicative behavior with non-verbal gestures (hand and head movements, and gaze following). A set of non-verbal gestures was identified that Nao could use to enhance its presentation and turn-management capabilities in conversational interactions. We discuss our approach to modeling and synthesizing gestures on the Nao robot, and present a scheme for system evaluation that compares users' expectations with their actual experiences. We found that open arm gestures, head movements, and gaze following could significantly enhance Nao's ability to be expressive and appear lively, and to engage human users in conversational interactions.
Place, publisher, year, edition, pages
IEEE, 2012, pp. 673-678.
Keywords: Arm gestures, Conversational interaction, Head movements, Human users, Humanoid robot, System evaluation
Subject areas: Computer Science; Language Technology (Computational Linguistics)
Identifiers
URN: urn:nbn:se:kth:diva-109397
DOI: 10.1109/CogInfoCom.2012.6421936
ISI: 000320454200107
ScopusID: 2-s2.0-84874426598
ISBN: 978-146735187-4
OAI: oai:DiVA.org:kth-109397
DiVA: diva2:581691
Conference: 3rd IEEE International Conference on Cognitive Infocommunications, CogInfoCom 2012; Kosice; 2-5 December 2012
QC 20130327. Available from: 2013-01-02. Created: 2013-01-02. Last updated: 2013-08-13. Bibliographically approved