Behavioural Responses to Robot Conversational Failures
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Speech, Music and Hearing, TMH. ORCID iD: 0000-0002-8874-6629
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Speech, Music and Hearing, TMH, Speech Communication and Technology.
KTH.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0003-3729-157X
2020 (English). Conference paper, Published paper (Refereed)
Abstract [en]

Humans and robots will increasingly collaborate in domestic environments, which will cause users to encounter more failures in interactions. Robots should be able to infer conversational failures by detecting human users’ behavioural and social signals. In this paper, we study and analyse these behavioural cues in response to robot conversational failures. Using a guided task corpus, where robot embodiment and time pressure are manipulated, we ask human annotators to estimate whether user affective states differ during various types of robot failures. We also train a random forest classifier to detect whether a robot failure has occurred and compare results to human annotator benchmarks. Our findings show that human-like robots augment users’ reactions to failures, as shown in users’ visual attention, in comparison to non-human-like smart-speaker embodiments. The results further suggest that speech behaviours are utilised more in responses to failures when non-human-like designs are present. This is particularly relevant for robot failure detection mechanisms, which may need to account for the robot’s physical design in their detection models.
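The abstract mentions training a random forest classifier on users’ behavioural cues to detect whether a robot failure has occurred. The sketch below is an illustration only, not the authors’ implementation: the feature names, placeholder data, and hyperparameters are assumptions, and it merely shows what a binary failure classifier of this kind could look like with scikit-learn.

```python
# Minimal sketch (assumptions throughout): a random forest trained on
# hypothetical per-segment behavioural features to predict whether a
# robot conversational failure occurred in that segment.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Placeholder feature matrix: rows are interaction segments, columns are
# illustrative behavioural cues (e.g. gaze-shift count, speech duration,
# pause length). In the study these would come from annotated recordings.
X = rng.random((200, 3))
# Placeholder binary labels: 1 = robot failure occurred, 0 = no failure.
y = rng.integers(0, 2, size=200)

# Hold out a test set so the classifier is evaluated on unseen segments.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Report precision/recall/F1 for the failure vs. no-failure classes.
print(classification_report(y_test, clf.predict(X_test)))
```

In the paper, such classifier output is compared against human annotator benchmarks; the placeholder random data above is only there to make the sketch runnable.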

Place, publisher, year, edition, pages
ACM Digital Library, 2020.
National Category
Interaction Technologies
Identifiers
URN: urn:nbn:se:kth:diva-267231
DOI: 10.1145/3319502.3374782
ISBN: 978-1-4503-6746-2 (print)
OAI: oai:DiVA.org:kth-267231
DiVA, id: diva2:1391493
Conference
International Conference on Human Robot Interaction (HRI), HRI ’20, March 23–26, 2020, Cambridge, United Kingdom
Note

QC 20200214

Available from: 2020-02-04 Created: 2020-02-04 Last updated: 2020-02-14 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Conference website

Authority records BETA

Kontogiorgos, Dimosthenis; Abelho Pereira, André Tiago; van Waveren, Sanne; Gustafson, Joakim

Search in DiVA

By author/editor
Kontogiorgos, Dimosthenis; Abelho Pereira, André Tiago; Sahindal, Boran; van Waveren, Sanne; Gustafson, Joakim
By organisation
Speech, Music and Hearing, TMH; Speech Communication and Technology; KTH; Robotics, Perception and Learning, RPL
Interaction Technologies
