Publications (10 of 16)
Shirol, S., Delfa, J. L., Leite, I. & Yadollahi, E. (2025). Designing Social Behaviours for Autonomous Mobile Robots: The Role of Movement and Light in Communicating Intent. In: HRI 2025 - Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction. Paper presented at 20th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2025, Melbourne, Australia, Mar 4 2025 - Mar 6 2025 (pp. 1638-1643). Institute of Electrical and Electronics Engineers (IEEE)
2025 (English). In: HRI 2025 - Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, Institute of Electrical and Electronics Engineers (IEEE), 2025, p. 1638-1643. Conference paper, Published paper (Refereed)
Abstract [en]

When autonomous mobile robots (AMRs) share space with humans, establishing trust becomes essential for safe, seamless, and effective interaction. Clear communication of a robot's intent is key to building trust by reducing uncertainty and enabling intuitive interaction. This study explores how AMRs can effectively communicate their intentions through simple, intuitive modalities like movement and light, making their actions more predictable and fostering trust. We designed distinct movement cues combined with light patterns to communicate two key intents: yielding (backing off) and making way (prompting humans to move), tested across four different scenarios. To evaluate the clarity and effectiveness of these behaviours, we conducted an online video study analysing qualitative feedback from open-ended responses. Additionally, we collected quantitative data assessing participants' perceptions of the safety and trustworthiness of the robot. Our findings demonstrate a strong correlation between these perceptions and the robot's ability to display socially aware behaviours.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025
Keywords
Human-Robot Interaction, Intent Communication, Interaction Design, Multi-modal Human-Robot Interaction, Social Robotics
National Category
Robotics and automation; Other Engineering and Technologies; Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-363757 (URN); 10.1109/HRI61500.2025.10973845 (DOI); 2-s2.0-105004879113 (Scopus ID)
Conference
20th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2025, Melbourne, Australia, Mar 4 2025 - Mar 6 2025
Note

Part of ISBN 9798350378931

QC 20250528

Available from: 2025-05-21. Created: 2025-05-21. Last updated: 2025-05-28. Bibliographically approved.
Khanna, P., Naoum, A., Yadollahi, E., Björkman, M. & Smith, C. (2025). REFLEX Dataset: A Multimodal Dataset of Human Reactions to Robot Failures and Explanations. In: Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction. Paper presented at ACM/IEEE International Conference on Human-Robot Interaction, HRI, Melbourne, Australia, March 4-6, 2025 (pp. 1032-1036). IEEE
2025 (English). In: Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, IEEE, 2025, p. 1032-1036. Conference paper, Published paper (Refereed)
Abstract [en]

This work presents REFLEX: Robotic Explanations to FaiLures and Human EXpressions, a comprehensive multimodal dataset capturing human reactions to robot failures and subsequent explanations in collaborative settings. It aims to facilitate research into human-robot interaction dynamics, addressing the need to study reactions to both initial failures and explanations, as well as the evolution of these reactions in long-term interactions. By providing rich, annotated data on human responses to different types of failures, explanation levels, and varying explanation strategies, the dataset contributes to the development of more robust, adaptive, and satisfying robotic systems capable of maintaining positive relationships with human collaborators, even during challenges like repeated failures.

Place, publisher, year, edition, pages
IEEE, 2025
Keywords
Human Robot Interaction, Dataset, Robotic Failures, Explainable AI
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-360946 (URN); 10.5555/3721488.3721616 (DOI)
Conference
ACM/IEEE International Conference on Human-Robot Interaction, HRI, Melbourne, Australia, March 4-6, 2025
Note

QC 20250310

Available from: 2025-03-06. Created: 2025-03-06. Last updated: 2025-03-10. Bibliographically approved.
Read, J., Chisik, Y., Yadollahi, E. & Horton, M. (2024). Children and Emerging Technologies: Ethical and Practical Research and Design. In: CHI 2024 - Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems. Paper presented at 2024 CHI Conference on Human Factors in Computing Systems, CHI EA 2024, Hybrid, Honolulu, United States of America, May 11 2024 - May 16 2024. Association for Computing Machinery (ACM), Article ID 593.
2024 (English). In: CHI 2024 - Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery (ACM), 2024, article id 593. Conference paper, Published paper (Refereed)
Abstract [en]

Child-Computer Interaction is concerned with the research, design, and evaluation of interactive technologies for children. Working with children in HCI is rewarding and fun, but managing that work so that children are kept comfortable and can participate in meaningful ways is not always easy. This course will provide attendees with practical tips for organising sessions with children, with signposts to methods for research, design, and evaluation, and will specifically consider the ethics of children's participation, with checklists to support attendees in doing the most ethical work possible. Our focus on emerging technologies makes this course especially valuable to those looking at AI, robots, XR, and related technologies.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2024
Keywords
Child Computer Interaction, Children, Design, Emerging Technologies, Ethics, Evaluation, Research
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:kth:diva-347318 (URN); 10.1145/3613905.3636275 (DOI); 001227587700010 (); 2-s2.0-85194198459 (Scopus ID)
Conference
2024 CHI Conference on Human Factors in Computing Systems, CHI EA 2024, Hybrid, Honolulu, United States of America, May 11 2024 - May 16 2024
Note

QC 20240613

Part of ISBN 979-840070331-7

Available from: 2024-06-10. Created: 2024-06-10. Last updated: 2025-02-18. Bibliographically approved.
Yadollahi, E., Romeo, M., Dogan, F. I., Johal, W., De Graaf, M., Levy-Tzedek, S. & Leite, I. (2024). Explainability for Human-Robot Collaboration. In: HRI 2024 Companion - Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction. Paper presented at 19th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2024, Boulder, United States of America, Mar 11 2024 - Mar 15 2024 (pp. 1364-1366). Association for Computing Machinery (ACM)
2024 (English). In: HRI 2024 Companion - Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, Association for Computing Machinery (ACM), 2024, p. 1364-1366. Conference paper, Published paper (Refereed)
Abstract [en]

In human-robot collaboration, explainability bridges the communication gap between complex machine functionalities and humans. An active area of investigation in robotics and AI is understanding and generating explanations that can enhance collaboration and mutual understanding between humans and machines. A key to achieving such seamless collaborations is understanding end-users, whether naive or expert, and tailoring explanation features that are intuitive, user-centred, and contextually relevant. Advancing this topic involves not only modelling humans' expectations for generating explanations but also developing metrics to evaluate generated explanations and to assess how effectively autonomous systems communicate their intentions, actions, and decision-making rationale. This workshop is designed to tackle the nuanced role of explainability in enhancing efficiency, safety, and trust in human-robot collaboration. It aims to initiate discussions on the importance of generating and evaluating explainability features developed in autonomous agents. Simultaneously, it addresses various challenges, including bias in explainability, the downsides of explainability, and deception in human-robot interaction.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2024
Keywords
Explainable Robotics, Human-Centered Robot Explanations, XAI
National Category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-344807 (URN); 10.1145/3610978.3638154 (DOI); 001255070800301 (); 2-s2.0-85188063647 (Scopus ID)
Conference
19th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2024, Boulder, United States of America, Mar 11 2024 - Mar 15 2024
Note

QC 20240409

Part of ISBN 9798400703232

Available from: 2024-03-28. Created: 2024-03-28. Last updated: 2024-10-11. Bibliographically approved.
Yadollahi, E., Ligthart, M. E., Sharma, K. & Rubegni, E. (2024). ExTra CTI: Explainable and Transparent Child-Technology Interaction. In: Proceedings of ACM Interaction Design and Children Conference: Inclusive Happiness, IDC 2024. Paper presented at 23rd Annual ACM Interaction Design and Children Conference, IDC 2024, Delft, Netherlands, Jun 17 2024 - Jun 20 2024 (pp. 1016-1019). Association for Computing Machinery (ACM)
2024 (English). In: Proceedings of ACM Interaction Design and Children Conference: Inclusive Happiness, IDC 2024, Association for Computing Machinery (ACM), 2024, p. 1016-1019. Conference paper, Published paper (Refereed)
Abstract [en]

When technology encompasses some form of intelligence or agency, as in robots, virtual agents, or artificial intelligence, understanding the reasoning behind its actions and decisions becomes an integral part of the interaction. This challenge extends beyond mere interaction to ensuring these technological entities engage with children in an understandable and transparent manner. Given the current emergence of research on explainability and transparency within human-robot interaction, a noticeable gap emerges when the target population shifts to children. Several challenges have contributed to this gap, including the difficulty of considering children's unique cognitive and emotional needs and of aligning the complexity of the technology with the developmental stages of young users. As we advance the field by generating more effective explanations or transparent behaviours in robots and agents, transitioning these advancements to more child-centric contexts demands a deeper understanding of how children perceive and comprehend technological behaviours. This workshop explores this gap and how we could tackle the critical role of developing technologies, e.g., robots, AI, and toys, that are more transparent and express more explainable behaviours. We aim to initiate discussions on the importance of understanding children's perception of different technologies, and on approaches to generate and evaluate explainability features tailored for child users interacting with autonomous agents and robots. Simultaneously, we address the challenges inherent in this context, including potential biases in explainability and the risks associated with deception in child-technology interaction.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2024
Keywords
Child-Computer Interaction, Child-Robot Interaction, Explainable Robotics, Human-Centered Robot Explanations, XAI
National Category
Human Computer Interaction; Robotics and automation; Other Engineering and Technologies
Identifiers
urn:nbn:se:kth:diva-350712 (URN); 10.1145/3628516.3661151 (DOI); 001253706300120 (); 2-s2.0-85197848899 (Scopus ID)
Conference
23rd Annual ACM Interaction Design and Children Conference, IDC 2024, Delft, Netherlands, Jun 17 2024 - Jun 20 2024
Note

Part of ISBN 9798400704420

QC 20240719

Available from: 2024-07-17. Created: 2024-07-17. Last updated: 2025-12-05. Bibliographically approved.
Oliveira, R. & Yadollahi, E. (2024). Robots in movies: a content analysis of the portrayal of fictional social robots. Behaviour & Information Technology, 43(5), 970-987
2024 (English). In: Behaviour & Information Technology, ISSN 0144-929X, E-ISSN 1362-3001, Vol. 43, no 5, p. 970-987. Article in journal (Refereed), Published
Abstract [en]

Movies and news reports represent, for many, the first source of contact with social robots. Accordingly, as tools for the dissemination of popular representations of robots, movies can have a direct impact on public perception, acceptance, and discourse about this type of technology. In this article, a content analysis of popular movies and franchises involving (fictional) social robots was conducted (k = 34). With this analysis, we sought to understand a) the main tropes used in movies involving robotic characters, b) the type of human-robot relationships depicted in those movies, and c) how the fictional robots compare with real robots in terms of their abilities. The results suggest that robots tend to be depicted in a polarized way that emphasizes either their extreme social abilities or their violent and destructive motives, with the former being slightly more prevalent. As a result, the relations between humans and robots tend to be either friendship or antagonism. Fictional robots are often portrayed as having advanced technical abilities that allow them to navigate multiple complex social settings and engage in different occupations typically performed by humans, in contrast with the abilities of the most popular commercially available robots we have today.

Place, publisher, year, edition, pages
Informa UK Limited, 2024
Keywords
media, Movies, robots, science fiction
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-367475 (URN); 10.1080/0144929X.2023.2196576 (DOI); 000961482700001 (); 2-s2.0-85151414455 (Scopus ID)
Note

QC 20250718

Available from: 2025-07-18. Created: 2025-07-18. Last updated: 2025-07-18. Bibliographically approved.
Torre, I., Holk, S., Yadollahi, E., Leite, I., McDonnell, R. & Harte, N. (2024). Smiling in the Face and Voice of Avatars and Robots: Evidence for a ‘smiling McGurk Effect’. IEEE Transactions on Affective Computing, 15(2), 393-404
2024 (English). In: IEEE Transactions on Affective Computing, E-ISSN 1949-3045, Vol. 15, no 2, p. 393-404. Article in journal (Refereed), Published
Abstract [en]

Multisensory integration influences emotional perception, as the McGurk effect demonstrates for communication between humans. Human physiology implicitly links the production of visual features with other modes like the audio channel: face muscles responsible for a smiling face also stretch the vocal cords, resulting in a characteristic smiling voice. For artificial agents capable of multimodal expression, this linkage must be modeled explicitly. In our studies, we observe the influence of the visual and audio channels on the perception of the agents' emotional expression. We created videos of virtual characters and social robots with either matching or mismatching emotional expressions in the audio and visual channels. In two online studies, we measured the agents' perceived valence and arousal. Our results consistently lend support to the 'emotional McGurk effect' hypothesis, according to which the face transmits valence information and the voice transmits arousal. When dealing with dynamic virtual characters, visual information is enough to convey both valence and arousal, and thus audio expressivity need not be congruent. When dealing with robots with fixed facial expressions, however, both visual and audio information need to be present to convey the intended expression.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024
Keywords
Face recognition, Faces, Human-Likeness, multisensory integration, Muscles, Robots, smiling, Social robots, Videos, virtual agent, Visualization, Behavioral research, Virtual reality, Audio channels, McGurk effect, Visual channels
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-328339 (URN); 10.1109/TAFFC.2022.3213269 (DOI); 001236687600001 (); 2-s2.0-85139846163 (Scopus ID)
Note

QC 20230608

Available from: 2023-06-08. Created: 2023-06-08. Last updated: 2024-06-17. Bibliographically approved.
Rajabi, N., Khanna, P., Kanik, S. U. D., Yadollahi, E., Vasco, M., Björkman, M., . . . Kragic, D. (2023). Detecting the Intention of Object Handover in Human-Robot Collaborations: An EEG Study. In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication, RO-MAN. Paper presented at 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Aug 28-31, 2023, Busan, South Korea (pp. 549-555). Institute of Electrical and Electronics Engineers (IEEE)
2023 (English). In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication, RO-MAN, Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 549-555. Conference paper, Published paper (Refereed)
Abstract [en]

Human-robot collaboration (HRC) relies on smooth and safe interactions. In this paper, we focus on the human-to-robot handover scenario, where the robot acts as the taker. We investigate the feasibility of detecting the intention of a human-to-robot handover action through the analysis of electroencephalogram (EEG) signals. Our study confirms that temporal patterns in EEG signals provide information about motor planning and can be leveraged to predict the likelihood of an individual executing a motor task with an average accuracy of 94.7%. We also show that time-frequency features of EEG signals in the final second prior to movement are effective for distinguishing handover actions from other actions. Furthermore, we classify human intentions for different tasks based on time-frequency representations of pre-movement EEG signals and achieve an average accuracy of 63.5% when contrasting every two tasks against each other. These results encourage the possibility of using EEG signals to detect human handover intention in HRC tasks.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
Series
IEEE RO-MAN, ISSN 1944-9445
National Category
Robotics and automation
Identifiers
urn:nbn:se:kth:diva-342040 (URN); 10.1109/RO-MAN57019.2023.10309426 (DOI); 001108678600078 (); 2-s2.0-85186991854 (Scopus ID)
Conference
32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Aug 28-31, 2023, Busan, South Korea
Note

Part of proceedings ISBN 979-8-3503-3670-2

QC 20240110

Available from: 2024-01-10. Created: 2024-01-10. Last updated: 2025-02-09. Bibliographically approved.
Khanna, P., Yadollahi, E., Björkman, M., Leite, I. & Smith, C. (2023). Effects of Explanation Strategies to Resolve Failures in Human-Robot Collaboration. In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication, RO-MAN. Paper presented at 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Aug 28-31, 2023, Busan, South Korea (pp. 1829-1836). Institute of Electrical and Electronics Engineers (IEEE)
2023 (English). In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication, RO-MAN, Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 1829-1836. Conference paper, Published paper (Refereed)
Abstract [en]

Despite significant improvements in robot capabilities, robots are likely to fail in human-robot collaborative tasks due to the high unpredictability of human environments and varying human expectations. In this work, we explore the role of a robot's explanation of failures in a human-robot collaborative task. We present a user study incorporating common failures in collaborative tasks, with human assistance to resolve them. In the study, a robot and a human work together to fill a shelf with objects. Upon encountering a failure, the robot explains the failure and the resolution to overcome it, either through handovers or through the human completing the task. The study uses different levels of robotic explanation, based on the failure action, failure cause, and action history, and different strategies for providing the explanation over the course of repeated interaction. Our results show that success in resolving the failures is a function not only of the level of explanation but also of the type of failure. Furthermore, while novice users rate the robot higher overall in terms of their satisfaction with the explanation, their satisfaction is a function not only of the robot's explanation level at a certain round but also of the prior information they received from the robot.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
Series
IEEE RO-MAN, ISSN 1944-9445
National Category
Robotics and automation
Identifiers
urn:nbn:se:kth:diva-341981 (URN); 10.1109/RO-MAN57019.2023.10309394 (DOI); 001108678600235 (); 2-s2.0-85187011787 (Scopus ID)
Conference
32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Aug 28-31, 2023, Busan, South Korea
Note

Part of proceedings ISBN 979-8-3503-3670-2

QC 20240109

Available from: 2024-01-09. Created: 2024-01-09. Last updated: 2025-03-06. Bibliographically approved.
Khanna, P., Yadollahi, E., Leite, I., Björkman, M. & Smith, C. (2023). How do Humans take an Object from a Robot: Behavior changes observed in a User Study. In: HAI 2023 - Proceedings of the 11th Conference on Human-Agent Interaction. Paper presented at 11th Conference on Human-Agent Interaction, HAI 2023, Gothenburg, Sweden, Dec 4 2023 - Dec 11 2023 (pp. 372-374). Association for Computing Machinery (ACM)
2023 (English). In: HAI 2023 - Proceedings of the 11th Conference on Human-Agent Interaction, Association for Computing Machinery (ACM), 2023, p. 372-374. Conference paper, Published paper (Refereed)
Abstract [en]

To facilitate human-robot interaction and gain human trust, a robot should recognize and adapt to changes in human behavior. This work documents the different human behaviors observed while taking objects from an interactive robot in an experimental study, categorized across two dimensions: the pull force applied and handedness. We also present the changes observed in human behavior over repeated interactions with the robot when taking various objects.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2023
Keywords
HRI, Human-Robot Collaboration, Human-Robot Handovers
National Category
Robotics and automation
Identifiers
urn:nbn:se:kth:diva-341674 (URN); 10.1145/3623809.3623929 (DOI); 001148034200049 (); 2-s2.0-85180129229 (Scopus ID)
Conference
11th Conference on Human-Agent Interaction, HAI 2023, Gothenburg, Sweden, Dec 4 2023 - Dec 11 2023
Note

Part of ISBN 9798400708244

QC 20231229

Available from: 2023-12-29. Created: 2023-12-29. Last updated: 2025-02-09. Bibliographically approved.
Organisations
Identifiers
ORCID iD: orcid.org/0000-0001-7091-0104
