Automatic Frustration Detection Using Thermal Imaging
2022 (English). In: Proceedings of the 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI '22), Institute of Electrical and Electronics Engineers (IEEE), 2022, p. 451-460. Conference paper, Published paper (Refereed)
Abstract [en]
To achieve seamless interactions, robots have to be capable of reliably detecting affective states in real time. One affective state that humans may experience while interacting with robots is frustration. Detecting frustration from RGB images can be challenging in some real-world situations; we therefore investigate whether thermal imaging can be used to build a model capable of detecting frustration induced by cognitive load and by failure. To train our model, we collected a dataset from 18 participants experiencing both types of robot-induced frustration. The model was tested using features from several modalities: thermal, RGB, Electrodermal Activity (EDA), and all three combined. When data from both frustration cases were combined and used as training input, the model reached an accuracy of 89% with just RGB features, 87% using only thermal features, 84% using EDA, and 86% when using all modalities. Furthermore, the highest accuracy for the thermal data was reached using three facial regions of interest: nose, forehead, and lower lip.
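As a rough illustration of the thermal-only pipeline the abstract describes, the sketch below extracts a mean-temperature feature from each of the three reported facial regions of interest and feeds them to a generic classifier. The ROI coordinates, the mean-temperature feature, and the random-forest classifier are illustrative assumptions; the record does not specify the paper's actual feature extraction or model.

```python
# Hypothetical sketch of a thermal-feature frustration classifier.
# ROI boxes, the mean-temperature feature, and the random forest are
# assumptions for illustration, not the paper's exact pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Assumed ROI boxes (row_min, row_max, col_min, col_max) in thermal-image pixels
ROIS = {
    "nose": (60, 80, 70, 90),
    "forehead": (10, 30, 50, 110),
    "lower_lip": (95, 105, 70, 90),
}

def thermal_features(frame: np.ndarray) -> np.ndarray:
    """Mean temperature of each ROI in one thermal frame."""
    return np.array([frame[r0:r1, c0:c1].mean()
                     for r0, r1, c0, c1 in ROIS.values()])

# Placeholder data standing in for labeled participant recordings:
# 200 thermal frames (120x160 px, ~33-37 degrees C) with binary labels
# (0 = not frustrated, 1 = frustrated).
frames = np.random.rand(200, 120, 160) * 4 + 33.0
y = np.random.randint(0, 2, size=200)
X = np.stack([thermal_features(f) for f in frames])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

With real data, X and y would come from frames of the participants' thermal recordings labeled by frustration condition, and RGB or EDA features could be concatenated onto each row of X for the multimodal variants.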
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2022, p. 451-460
Series
ACM IEEE International Conference on Human-Robot Interaction, ISSN 2167-2121
Keywords [en]
Human-robot interaction, Thermal imaging, Frustration, Cognitive load, Action units
National Category
Human Computer Interaction
Identifiers
URN: urn:nbn:se:kth:diva-322478
DOI: 10.1109/HRI53351.2022.9889545
ISI: 000869793600050
Scopus ID: 2-s2.0-85140750883
OAI: oai:DiVA.org:kth-322478
DiVA, id: diva2:1719935
Conference
17th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI), March 7-10, 2022, held online (virtual conference)
Note
Part of proceedings: ISBN 978-1-6654-0731-1
Available from: 2022-12-16. Created: 2022-12-16. Last updated: 2022-12-16. Bibliographically approved