Publications (10 of 82)
Khanna, P., Rajabi, N., Kanik, S. U. e., Kragic Jensfelt, D., Björkman, M. & Smith, C. (2026). Early detection of human handover intentions in human–robot collaboration: Comparing EEG, gaze, and hand motion. Robotics and Autonomous Systems, 196, Article ID 105244.
2026 (English) In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 196, article id 105244. Article in journal (Refereed) Published
Abstract [en]

Human–robot collaboration (HRC) relies on accurate and timely recognition of human intentions to ensure seamless interactions. Among common HRC tasks, human-to-robot object handovers have been studied extensively for planning the robot's actions during object reception, on the assumption that the human intends a handover. However, distinguishing handover intentions from other actions has received limited attention. Most research on handovers has focused on visually detecting motion trajectories, which often results in delays or false detections when trajectories overlap. This paper investigates whether human intentions for object handovers are reflected in non-movement-based physiological signals. We conduct a multimodal analysis comparing three data modalities: electroencephalogram (EEG), gaze, and hand-motion signals. Our study aims to distinguish between handover-intended human motions and non-handover motions in an HRC setting, evaluating each modality's performance in predicting and classifying these actions before and after human movement initiation. We develop and evaluate human intention detectors based on these modalities, comparing their accuracy and timing in identifying handover intentions. To the best of our knowledge, this is the first study to systematically develop and test intention detectors across multiple modalities within the same experimental context of human–robot handovers. Our analysis reveals that handover intention can be detected from all three modalities. Nevertheless, gaze signals are both the earliest and the most accurate in classifying a motion as handover-intended or not.
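The abstract describes per-modality intention detectors that classify a motion as handover-intended or not. As a loose illustration only — the "gaze fixation fraction" feature, the class distributions, and all names below are hypothetical stand-ins, not the detectors developed in the paper — such a single-feature detector can be sketched as:

```python
import numpy as np

# Hypothetical single-feature detector: threshold placed midway between
# the class means of a synthetic gaze feature. Purely illustrative.

def train_threshold_detector(pos, neg):
    """Place the decision threshold midway between the class means."""
    return (pos.mean() + neg.mean()) / 2.0

def predict(feature, threshold):
    """Return 1 for handover-intended, 0 for a non-handover motion."""
    return int(feature > threshold)

rng = np.random.default_rng(0)
handover = rng.normal(0.8, 0.05, 50)      # synthetic handover trials
non_handover = rng.normal(0.3, 0.05, 50)  # synthetic non-handover trials
thr = train_threshold_detector(handover, non_handover)
```

An analogous detector could be fit per modality (EEG, gaze, hand motion) and compared on accuracy and detection time, which is the kind of comparison the paper performs.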

Place, publisher, year, edition, pages
Elsevier BV, 2026
Keywords
EEG, Gaze, Human–robot collaboration (HRC), Human–robot handovers, Motion analysis
HSV category
Identifiers
urn:nbn:se:kth:diva-373139 (URN) 10.1016/j.robot.2025.105244 (DOI) 2-s2.0-105021346666 (Scopus ID)
Note

QC 20251121

Available from: 2025-11-21 Created: 2025-11-21 Last updated: 2025-11-21 Bibliographically approved
Mohamed, Y., Lemaignan, S., Güneysu, A., Jensfelt, P. & Smith, C. (2025). Are You an Expert? Instruction Adaptation Using Multi-Modal Affect Detections with Thermal Imaging and Context. In: : . Paper presented at IEEE International Conference on Robot and Human Interactive Communication, Eindhoven University of Technology, Eindhoven, The Netherlands, Aug 25-29, 2025.
2025 (English) Conference paper, Published paper (Refereed)
HSV category
Identifiers
urn:nbn:se:kth:diva-369162 (URN)
Conference
IEEE International Conference on Robot and Human Interactive Communication, Eindhoven University of Technology, Eindhoven, The Netherlands, Aug 25-29, 2025.
Available from: 2025-08-29 Created: 2025-08-29 Last updated: 2025-09-05 Bibliographically approved
Styrud, J., Iovino, M., Norrlöf, M., Björkman, M. & Smith, C. (2025). Automatic Behavior Tree Expansion with LLMs for Robotic Manipulation. In: 2025 IEEE International Conference on Robotics and Automation, ICRA 2025: . Paper presented at 2025 IEEE International Conference on Robotics and Automation, ICRA 2025, Atlanta, United States of America, May 19 2025 - May 23 2025 (pp. 1225-1232). Institute of Electrical and Electronics Engineers (IEEE)
2025 (English) In: 2025 IEEE International Conference on Robotics and Automation, ICRA 2025, Institute of Electrical and Electronics Engineers (IEEE), 2025, pp. 1225-1232. Conference paper, Published paper (Refereed)
Abstract [en]

Robotic systems for manipulation tasks are increasingly expected to be easy to configure for new tasks or unpredictable environments, while keeping a transparent policy that is readable and verifiable by humans. We propose the method BEhavior TRee eXPansion with Large Language Models (BETR-XP-LLM) to dynamically and automatically expand and configure Behavior Trees as policies for robot control. The method utilizes an LLM to resolve errors outside the task planner's capabilities, both during planning and execution. We show that the method is able to solve a variety of tasks and failures and permanently update the policy to handle similar problems in the future.
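A minimal sketch of the core idea — expanding a behavior tree when a node fails, with the replacement subtree proposed by an LLM. Here the LLM call is stubbed with a lookup table, and all class and function names are illustrative; BETR-XP-LLM's actual planner and prompting interface differ:

```python
# Illustrative behavior-tree expansion on failure. Hypothetical names;
# the stubbed llm_propose_fix stands in for a real LLM query.

class Node:
    """Sequence node if it has children, action leaf otherwise."""
    def __init__(self, name, action=None, children=None):
        self.name, self.action = name, action
        self.children = children or []

    def tick(self):
        if self.action is not None:          # leaf: execute its action
            return self.action()
        for child in self.children:          # sequence: all must succeed
            status = child.tick()
            if status != "SUCCESS":
                return status
        return "SUCCESS"

def llm_propose_fix(failed_name):
    """Stand-in for an LLM call mapping a failure to a recovery subtree."""
    fixes = {
        "grasp": Node("clear_then_grasp", children=[
            Node("remove_obstacle", action=lambda: "SUCCESS"),
            Node("grasp_retry", action=lambda: "SUCCESS"),
        ]),
    }
    return fixes.get(failed_name)

def expand_on_failure(root):
    """Replace the first failing child with the proposed subtree,
    permanently updating the policy for future executions."""
    for i, child in enumerate(root.children):
        if child.tick() == "FAILURE":
            fix = llm_propose_fix(child.name)
            if fix is not None:
                root.children[i] = fix
            return root
    return root
```

Because the expanded subtree stays in the tree, the same failure is handled without a new LLM call on later runs — the "permanent policy update" the abstract mentions.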

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025
HSV category
Identifiers
urn:nbn:se:kth:diva-371379 (URN) 10.1109/ICRA55743.2025.11127942 (DOI) 2-s2.0-105016707385 (Scopus ID)
Conference
2025 IEEE International Conference on Robotics and Automation, ICRA 2025, Atlanta, United States of America, May 19 2025 - May 23 2025
Note

Part of ISBN 9798331541392

QC 20251010

Available from: 2025-10-10 Created: 2025-10-10 Last updated: 2025-10-10 Bibliographically approved
Iovino, M., Förster, J., Falco, P., Jen Chung, J., Siegwart, R. & Smith, C. (2025). Comparison between Behavior Trees and Finite State Machines. IEEE Transactions on Automation Science and Engineering, 22, 21098-21117
2025 (English) In: IEEE Transactions on Automation Science and Engineering, ISSN 1545-5955, E-ISSN 1558-3783, Vol. 22, pp. 21098-21117. Article in journal (Refereed) Published
Abstract [en]

Behavior Trees (BTs) were first conceived in the computer games industry as a tool to model agent behavior, but they have also attracted interest in the robotics community as an alternative policy design to Finite State Machines (FSMs). The advantages of BTs over FSMs have been highlighted in many works, but there is no thorough practical comparison of the two designs. Such a comparison is particularly relevant in the robotics industry, where FSMs have been the state-of-the-art policy representation for robot control for many years. In this work we shed light on this matter by comparing how BTs and FSMs behave when controlling a robot in a mobile manipulation task. The comparison is made in terms of reactivity, modularity, readability, and design. We propose metrics for each of these properties, aware that while some are tangible and objective, others are more subjective and implementation-dependent. The practical comparison is performed in a simulation environment with validation on a real robot. We find that although the robot's behavior during task solving is independent of the policy representation, maintaining a BT rather than an FSM becomes easier as the task increases in complexity.
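The structural difference between the two representations can be made concrete with a toy task. The sketch below is not the paper's benchmark or metrics; it merely contrasts an FSM's global transition table with a BT sequence's local, tick-based composition:

```python
# The same two-step task ("goto" then "grasp") written both ways.
# Toy contrast only; the paper's tasks and metrics are far richer.

# FSM: a global transition table. Inserting a new step means rewiring
# every transition that touches it.
FSM = {
    ("idle", "start"): "goto",
    ("goto", "arrived"): "grasp",
    ("grasp", "holding"): "done",
}

def fsm_step(state, event):
    """Follow a transition; unknown events leave the state unchanged."""
    return FSM.get((state, event), state)

# BT: a sequence ticks children left to right and stops at the first
# failure. Inserting a behavior is a local edit (add one child).
def bt_tick(children):
    for child in children:
        if not child():
            return False
    return True
```

Re-ticking the BT from the root each cycle is what gives BTs their reactivity, while the FSM only moves when an explicit transition fires — one concrete face of the maintainability difference the abstract reports.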

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025
Keywords
Behavior Trees, Collaborative Robotics, Finite State Machines, Mobile Manipulation, Robot Control
HSV category
Identifiers
urn:nbn:se:kth:diva-371275 (URN) 10.1109/TASE.2025.3610090 (DOI) 001579020600013 () 2-s2.0-105016561233 (Scopus ID)
Note

QC 20251013

Available from: 2025-10-13 Created: 2025-10-13 Last updated: 2025-10-13 Bibliographically approved
Mohamed, Y., Lemaignan, S., Güneysu, A., Jensfelt, P. & Smith, C. (2025). Context Matters: Understanding Socially Appropriate Affective Responses Via Sentence Embeddings. In: Social Robotics - 16th International Conference, ICSR + AI 2024, Proceedings: . Paper presented at 16th International Conference on Social Robotics, ICSR + AI 2024, Odense, Denmark, October 23-26, 2024 (pp. 78-91). Springer Nature
2025 (English) In: Social Robotics - 16th International Conference, ICSR + AI 2024, Proceedings, Springer Nature, 2025, pp. 78-91. Conference paper, Published paper (Refereed)
Abstract [en]

As AI systems increasingly engage in social interactions, comprehending human social dynamics is crucial. Affect recognition enables systems to respond appropriately to emotional nuances in social situations. However, existing multimodal approaches fail to account for the social appropriateness of detected emotions within their contexts. This paper presents a novel methodology leveraging sentence embeddings to distinguish socially appropriate and inappropriate interactions for more context-aware AI systems. Our approach measures the semantic distance between facial expression descriptions and predefined reference points. We evaluate our method using a benchmark dataset and a real-world robot deployment in a library, combining GPT-4(V) for expression descriptions and ada-2 for sentence embeddings to detect socially inappropriate interactions. Our results underscore the importance of considering contextual factors for effective social interaction understanding through context-aware affect recognition, contributing to the development of socially intelligent AI capable of interpreting and responding to human affect appropriately.
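The measurement step the abstract describes — semantic distance between an expression description and predefined reference points — reduces to cosine similarity between embedding vectors. The sketch below assumes generic embedding vectors and illustrative function names; the paper itself uses ada-2 embeddings of GPT-4(V) expression descriptions:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def judge_appropriateness(description_emb, appropriate_ref, inappropriate_ref):
    """Label a facial-expression description by whichever predefined
    reference embedding it is semantically closer to."""
    if cosine_sim(description_emb, appropriate_ref) >= \
       cosine_sim(description_emb, inappropriate_ref):
        return "appropriate"
    return "inappropriate"
```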

Place, publisher, year, edition, pages
Springer Nature, 2025
Keywords
embeddings, human-robot interaction, machine learning, Social representation
HSV category
Identifiers
urn:nbn:se:kth:diva-362501 (URN) 10.1007/978-981-96-3522-1_9 (DOI) 001531722800009 () 2-s2.0-105002016733 (Scopus ID)
Conference
16th International Conference on Social Robotics, ICSR + AI 2024, Odense, Denmark, October 23-26, 2024
Note

Part of ISBN 9789819635214

QC 20250428

Available from: 2025-04-16 Created: 2025-04-16 Last updated: 2025-12-08 Bibliographically approved
Mohamed, Y., Séverin, L., Güneysu, A., Jensfelt, P. & Smith, C. (2025). Fusion in Context: A Multimodal Approach to Affective State Recognition. In: : . Paper presented at 34th IEEE International Conference on Robot and Human Interactive Communication, Eindhoven University of Technology, Eindhoven, The Netherlands, Aug 25-29, 2025.
2025 (English) Conference paper, Published paper (Refereed)
HSV category
Identifiers
urn:nbn:se:kth:diva-369160 (URN)
Conference
34th IEEE International Conference on Robot and Human Interactive Communication, Eindhoven University of Technology, Eindhoven, The Netherlands, Aug 25-29, 2025.
Note

QC 20250905

Available from: 2025-08-29 Created: 2025-08-29 Last updated: 2025-09-05 Bibliographically approved
Khanna, P., Naoum, A., Yadollahi, E., Björkman, M. & Smith, C. (2025). REFLEX Dataset: A Multimodal Dataset of Human Reactions to Robot Failures and Explanations. In: Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction: . Paper presented at ACM/IEEE International Conference on Human-Robot Interaction, HRI, Melbourne, Australia, March 4-6, 2025 (pp. 1032-1036). IEEE
2025 (English) In: Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, IEEE, 2025, pp. 1032-1036. Conference paper, Published paper (Refereed)
Abstract [en]

This work presents REFLEX: Robotic Explanations to FaiLures and Human EXpressions, a comprehensive multimodal dataset capturing human reactions to robot failures and subsequent explanations in collaborative settings. It aims to facilitate research into human-robot interaction dynamics, addressing the need to study reactions to both initial failures and explanations, as well as the evolution of these reactions in long-term interactions. By providing rich, annotated data on human responses to different types of failures, explanation levels, and varying explanation strategies, the dataset contributes to the development of more robust, adaptive, and satisfying robotic systems capable of maintaining positive relationships with human collaborators, even during challenges like repeated failures.

Place, publisher, year, edition, pages
IEEE, 2025
Keywords
Human Robot Interaction, Dataset, Robotic Failures, Explainable AI
HSV category
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-360946 (URN) 10.5555/3721488.3721616 (DOI)
Conference
ACM/IEEE International Conference on Human-Robot Interaction, HRI, Melbourne, Australia, March 4-6, 2025
Note

QC 20250310

Available from: 2025-03-06 Created: 2025-03-06 Last updated: 2025-03-10 Bibliographically approved
Khanna, P., Naoum, A., Yadollahi, E., Björkman, M. & Smith, C. (2025). REFLEX Dataset: A Multimodal Dataset of Human Reactions to Robot Failures and Explanations. In: HRI 2025 - Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction: . Paper presented at 20th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2025, Melbourne, Australia, March 4-6, 2025 (pp. 1032-1036). Institute of Electrical and Electronics Engineers (IEEE)
2025 (English) In: HRI 2025 - Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, Institute of Electrical and Electronics Engineers (IEEE), 2025, pp. 1032-1036. Conference paper, Published paper (Refereed)
Abstract [en]

This work presents REFLEX: Robotic Explanations to FaiLures and Human EXpressions, a comprehensive multimodal dataset capturing human reactions to robot failures and subsequent explanations in collaborative settings. It aims to facilitate research into human-robot interaction dynamics, addressing the need to study reactions to both initial failures and explanations, as well as the evolution of these reactions in long-term interactions. By providing rich, annotated data on human responses to different types of failures, explanation levels, and varying explanation strategies, the dataset contributes to the development of more robust, adaptive, and satisfying robotic systems capable of maintaining positive relationships with human collaborators, even during challenges like repeated failures.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025
Keywords
Dataset, Explainable AI, Human Robot Interaction, Robotic Failures
HSV category
Identifiers
urn:nbn:se:kth:diva-363769 (URN) 10.1109/HRI61500.2025.10974185 (DOI) 2-s2.0-105004877597 (Scopus ID)
Conference
20th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2025, Melbourne, Australia, March 4-6, 2025
Note

Part of ISBN 9798350378931

QC 20250526

Available from: 2025-05-21 Created: 2025-05-21 Last updated: 2025-05-26 Bibliographically approved
Styrud, J., Mayr, M., Hellsten, E., Krueger, V. & Smith, C. (2024). BeBOP - Combining Reactive Planning and Bayesian Optimization to Solve Robotic Manipulation Tasks. In: 2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2024): . Paper presented at IEEE International Conference on Robotics and Automation (ICRA), MAY 13-17, 2024, Yokohama, JAPAN (pp. 16459-16466). Institute of Electrical and Electronics Engineers (IEEE)
2024 (English) In: 2024 IEEE International Conference on Robotics and Automation (ICRA 2024), Institute of Electrical and Electronics Engineers (IEEE), 2024, pp. 16459-16466. Conference paper, Published paper (Refereed)
Abstract [en]

Robotic systems for manipulation tasks are increasingly expected to be easy to configure for new tasks. While in the past, robot programs were often written statically and tuned manually, the current, faster transition times call for robust, modular and interpretable solutions that also allow a robotic system to learn how to perform a task. We propose the method Behavior-based Bayesian Optimization and Planning (BeBOP) that combines two approaches for generating behavior trees: we build the structure using a reactive planner and learn specific parameters with Bayesian optimization. The method is evaluated on a set of robotic manipulation benchmarks and is shown to outperform state-of-the-art reinforcement learning algorithms by being up to 46 times faster while simultaneously being less dependent on reward shaping. We also propose a modification to the uncertainty estimate for the random forest surrogate models that drastically improves the results.
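The learning half of BeBOP — tuning the parameters of a fixed behavior-tree structure with Bayesian optimization — can be caricatured with a tiny NumPy-only loop. The nearest-neighbour "surrogate" and distance bonus below are crude stand-ins for the paper's random-forest surrogate and its modified uncertainty estimate; nothing here is the authors' implementation:

```python
import numpy as np

def bayes_opt(objective, bounds, n_init=5, n_iter=20, seed=0):
    """Maximize a black-box objective over a 1-D interval.

    Surrogate mean = value of the nearest evaluated point; acquisition
    adds a distance-to-data exploration bonus. A sketch, not BeBOP.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = [float(x) for x in rng.uniform(lo, hi, n_init)]
    y = [objective(x) for x in X]
    for _ in range(n_iter):
        cand = rng.uniform(lo, hi, 256)           # candidate parameters
        dist = np.abs(cand[:, None] - np.array(X)[None, :])
        mean = np.array(y)[dist.argmin(axis=1)]   # surrogate mean
        bonus = dist.min(axis=1)                  # exploration term
        x_next = float(cand[(mean + bonus).argmax()])
        X.append(x_next)
        y.append(objective(x_next))  # in BeBOP: run the BT, score the episode
    best = int(np.argmax(y))
    return X[best], y[best]
```

In the paper's setting the objective would be the task reward obtained by executing the planner-built behavior tree with the candidate parameters.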

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024
Series
IEEE International Conference on Robotics and Automation ICRA, ISSN 1050-4729
Keywords
Behavior Trees, Bayesian Optimization, Task Planning, Robotic manipulation
HSV category
Identifiers
urn:nbn:se:kth:diva-360969 (URN) 10.1109/ICRA57147.2024.10611468 (DOI) 001369728005084 () 2-s2.0-85190848983 (Scopus ID)
Conference
IEEE International Conference on Robotics and Automation (ICRA), MAY 13-17, 2024, Yokohama, JAPAN
Note

Part of ISBN 979-8-3503-8458-1, 979-8-3503-8457-4

QC 20250310

Available from: 2025-03-10 Created: 2025-03-10 Last updated: 2025-03-10 Bibliographically approved
Hallen, M., Iovino, M., Sander-Tavallaey, S. & Smith, C. (2024). Behavior Trees in Industrial Applications: A Case Study in Underground Explosive Charging. In: 2024 IEEE 20th International Conference on Automation Science and Engineering, CASE 2024: . Paper presented at 20th IEEE International Conference on Automation Science and Engineering, CASE 2024, Bari, Italy, Aug 28 2024 - Sep 1 2024 (pp. 156-162). Institute of Electrical and Electronics Engineers (IEEE)
2024 (English) In: 2024 IEEE 20th International Conference on Automation Science and Engineering, CASE 2024, Institute of Electrical and Electronics Engineers (IEEE), 2024, pp. 156-162. Conference paper, Published paper (Refereed)
Abstract [en]

In industrial applications, Finite State Machines (FSMs) are often used to implement decision-making policies for autonomous systems. In recent years, the use of Behavior Trees (BTs) as an alternative policy representation has gained considerable attention. The benefits of using BTs over FSMs are modularity and reusability, enabling a system that is easy to extend and modify. However, there exist few published studies on successful implementations of BTs for industrial applications. This paper contributes the lessons learned from implementing BTs in a complex industrial use case, where a robotic system assembles explosive charges and places them in holes on the rock face. The main result of the paper is that even if it is possible to model the entire system as a BT, combining BTs with FSMs can increase the readability and maintainability of the system. This benefit is especially notable in the use case studied in this paper, where the full system cannot run autonomously but human supervision and feedback are needed.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024
Keywords
Behavior Trees, Behavior Trees in Robotics Applications, Finite State Machines, Modularity
HSV category
Identifiers
urn:nbn:se:kth:diva-367272 (URN) 10.1109/CASE59546.2024.10711822 (DOI) 001361783100023 () 2-s2.0-85206356835 (Scopus ID)
Conference
20th IEEE International Conference on Automation Science and Engineering, CASE 2024, Bari, Italy, Aug 28 2024 - Sep 1 2024
Note

Part of ISBN 9798350358513

QC 20250717

Available from: 2025-07-17 Created: 2025-07-17 Last updated: 2025-07-17 Bibliographically approved
Organisations
Identifiers
ORCID iD: orcid.org/0000-0003-2078-8854