Design Implications for Effective Robot Gaze Behaviors in Multiparty Interactions
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0001-7130-0826
Yale University, New Haven, CT, USA.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0002-2212-4325
2022 (English). In: Proceedings of the 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI '22), Institute of Electrical and Electronics Engineers (IEEE), 2022, p. 976-980. Conference paper, Published paper (Refereed)
Abstract [en]

Human-robot non-verbal communication has been a growing focus of research, as we realize its importance in achieving interaction goals (e.g. modulating turn-taking) and managing human perception of the interaction. Consequently, the development of models for robot non-verbal behavior, such as gaze, should be informed by studies of human reaction to and perception of that behavior. Here, we look at data from two studies in which two humans interact by describing words to a robot. The robot tries to balance the participation of the two players through a combination of gaze aversion, looking at the listener, and looking at the speaker. We analyze how momentary gaze patterns are reflected in participants' turn length and perception of the robot, as well as in the participation imbalance. Our findings may serve as recommendations for crafting robot gaze behaviors in multiparty interactions.
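The abstract describes a robot that balances two players' participation by mixing gaze aversion with looking at the speaker or the listener, but the record contains no implementation details. The sketch below is only an illustrative guess at how such a participation-balancing gaze policy might be structured; the function name, the aversion probability, and the rebalancing rule are assumptions, not taken from the paper.

import random

# Hypothetical gaze-target policy: favor the participant who has spoken less,
# and occasionally avert gaze. All thresholds and rules are illustrative only.

GAZE_AVERSION_PROB = 0.2  # assumed fraction of decisions spent looking away

def choose_gaze_target(speaking_time, current_speaker):
    """Pick a gaze target: 'away', the current speaker, or the listener.

    speaking_time   -- dict mapping participant id -> accumulated seconds spoken
    current_speaker -- id of the participant currently speaking, or None
    """
    if random.random() < GAZE_AVERSION_PROB:
        return "away"  # gaze aversion
    if current_speaker is None:
        # No one is speaking: look at the quieter participant to invite a turn
        return min(speaking_time, key=speaking_time.get)
    listener = next(p for p in speaking_time if p != current_speaker)
    # If participation is imbalanced toward the speaker, look at the listener
    # to encourage them; otherwise acknowledge the speaker.
    if speaking_time[current_speaker] > speaking_time[listener]:
        return listener
    return current_speaker

# Example: player B has spoken less, so while A speaks the robot tends to
# look at B to rebalance participation.
print(choose_gaze_target({"A": 42.0, "B": 15.0}, current_speaker="A"))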

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2022. p. 976-980
Series
ACM/IEEE International Conference on Human-Robot Interaction, ISSN 2167-2121
Keywords [en]
multiparty interaction, gaze, non-verbal behavior, social robotics
National Category
Human Computer Interaction
Identifiers
URN: urn:nbn:se:kth:diva-322475
DOI: 10.1109/HRI53351.2022.9889481
ISI: 000869793600142
Scopus ID: 2-s2.0-85139511204
OAI: oai:DiVA.org:kth-322475
DiVA, id: diva2:1719943
Conference
17th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI), March 7-10, 2022, held online
Note

Part of proceedings: ISBN 978-1-6654-0731-1

QC 20221216

Available from: 2022-12-16. Created: 2022-12-16. Last updated: 2022-12-16. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | Scopus

Authority records

Parreira, Maria Teresa; Gillet, Sarah; Leite, Iolanda
