kth.se Publications
Publications (7 of 7)
Zhang, Y., Rajabi, N., Taleb, F., Matviienko, A., Ma, Y., Björkman, M. & Kragic, D. (2025). Mind Meets Robots: A Review of EEG-Based Brain-Robot Interaction Systems. International Journal of Human-Computer Interaction, 1-32
Mind Meets Robots: A Review of EEG-Based Brain-Robot Interaction Systems
2025 (English) In: International Journal of Human-Computer Interaction, ISSN 1044-7318, E-ISSN 1532-7590, p. 1-32. Article in journal (Refereed). Published
Abstract [en]

Brain-robot interaction (BRI) empowers individuals to control (semi-)automated machines through brain activity, either passively or actively. In the past decade, BRI systems have advanced significantly, primarily leveraging electroencephalogram (EEG) signals. This article presents an up-to-date review of 87 curated studies published between 2018 and 2023, identifying the research landscape of EEG-based BRI systems. The review consolidates methodologies, interaction modes, application contexts, system evaluation, existing challenges, and future directions in this domain. Based on our analysis, we propose a BRI system model comprising three entities: Brain, Robot, and Interaction, depicting their internal relationships. We especially examine interaction modes between human brains and robots, an aspect not yet fully explored. Within this model, we scrutinize and classify current research, extract insights, highlight challenges, and offer recommendations for future studies. Our findings provide a structured design space for human-robot interaction (HRI), informing the development of more efficient BRI frameworks.

Place, publisher, year, edition, pages
Informa UK Limited, 2025
Keywords
EEG based, brain-robot interaction, interaction mode, comprehensive review
National Category
Vehicle and Aerospace Engineering
Identifiers
urn:nbn:se:kth:diva-361866 (URN)
10.1080/10447318.2025.2464915 (DOI)
001446721000001 (ISI)
2-s2.0-105000309480 (Scopus ID)
Note

QC 20250402

Available from: 2025-04-02. Created: 2025-04-02. Last updated: 2025-04-02. Bibliographically approved.
Taleb, F., Vasco, M., Rajabi, N., Björkman, M. & Kragic, D. (2024). Challenging Deep Learning Methods for EEG Signal Denoising under Data Corruption. In: 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2024 - Proceedings. Paper presented at 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2024, Orlando, United States of America, Jul 15 2024 - Jul 19 2024. Institute of Electrical and Electronics Engineers (IEEE)
Challenging Deep Learning Methods for EEG Signal Denoising under Data Corruption
2024 (English) In: 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2024 - Proceedings, Institute of Electrical and Electronics Engineers (IEEE), 2024. Conference paper, Published paper (Refereed)
Abstract [en]

Capturing informative electroencephalogram (EEG) signals is a challenging task due to the presence of noise (e.g., due to human movement). In extreme cases, data recordings from specific electrodes (channels) can become corrupted and entirely devoid of information. Motivated by recent work on deep-learning-based approaches for EEG signal denoising, we present the first benchmark study on the performance of EEG signal denoising methods in the presence of corrupted channels. We design our study considering a wide variety of datasets, models, and evaluation tasks. Our results highlight the need for assessing the performance of EEG deep-learning models across a broad suite of datasets, as provided by our benchmark.
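
The record includes no code; as a minimal sketch of the corruption setting the abstract describes, one could zero out randomly chosen channels and measure how well a denoiser recovers the clean recording. The function names, the RMSE metric, and the identity-baseline "denoiser" below are illustrative assumptions, not the paper's actual benchmark.

```python
import numpy as np

def corrupt_channels(eeg, n_corrupt, rng):
    """Zero out n_corrupt randomly chosen channels of an
    (n_channels, n_samples) recording, mimicking electrodes
    that become entirely devoid of information."""
    corrupted = eeg.copy()
    dead = rng.choice(eeg.shape[0], size=n_corrupt, replace=False)
    corrupted[dead] = 0.0
    return corrupted, dead

def benchmark(denoise_fn, clean, n_corrupt=4, seed=0):
    """RMSE of a denoiser's output against the clean signal
    when some channels are corrupted before denoising."""
    rng = np.random.default_rng(seed)
    noisy, _ = corrupt_channels(clean, n_corrupt, rng)
    restored = denoise_fn(noisy)
    return np.sqrt(np.mean((restored - clean) ** 2))

# Example with synthetic data and a trivial identity "denoiser";
# a real study would plug in trained deep-learning models here.
clean = np.random.default_rng(1).standard_normal((32, 1024))
print(benchmark(lambda x: x, clean))
```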

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024
Keywords
data corruption, deep learning, EEG, signal denoising, signal noise
National Category
Signal Processing; Computer Sciences
Identifiers
urn:nbn:se:kth:diva-358866 (URN)
10.1109/EMBC53108.2024.10782132 (DOI)
40039138 (PubMedID)
2-s2.0-85214969123 (Scopus ID)
Conference
46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2024, Orlando, United States of America, Jul 15 2024 - Jul 19 2024
Note

Part of ISBN 9798350371499

QC 20250128

Available from: 2025-01-23. Created: 2025-01-23. Last updated: 2025-05-27. Bibliographically approved.
Wallace, B., Van Otterdijk, M., Zhang, Y., Rajabi, N., Marin-Bucio, D., Kragic, D. & Torresen, J. (2024). Imitation or Innovation? Translating Features of Expressive Motion from Humans to Robots. In: HAI 2024 - Proceedings of the 12th International Conference on Human-Agent Interaction. Paper presented at 12th International Conference on Human-Agent Interaction, HAI 2024, Swansea, United Kingdom of Great Britain and Northern Ireland, Nov 24 2024 - Nov 27 2024 (pp. 296-304). Association for Computing Machinery (ACM)
Imitation or Innovation? Translating Features of Expressive Motion from Humans to Robots
2024 (English) In: HAI 2024 - Proceedings of the 12th International Conference on Human-Agent Interaction, Association for Computing Machinery (ACM), 2024, p. 296-304. Conference paper, Published paper (Refereed)
Abstract [en]

Expressive robot motion can help establish acceptance of this technology in everyday life, but understanding what makes movement expressive is a complex and multifaceted task. This paper presents the results of an online study with 46 participants that explores how people perceive and interpret the expressive qualities of human movement and how they envision translating their descriptions onto an imagined non-humanoid, quadrupedal robot. Through a qualitative analysis of responses, we conceptualize three themes: their understanding of intent, their interpretations of movement qualities, and finally, their translation from human to robot movement. Respondents' descriptions of their initial understanding of the performer's intent fall into two modes: bio-mechanical and narrative. We illustrate their interpretations of movement qualities through four strategies: movement features as kinematic indicators, intent indicators, attributed context, and perceived internal states. Lastly, we observe their translation from human to robot movement, with a particular focus on respondents' use of kinaesthetic empathy and anthropomorphism. Our findings support a bottom-up approach that draws on users' general knowledge for designing expressive robot motion.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2024
Keywords
Expressive movement, Human-robot interaction, Nonverbal communication
National Category
Human Computer Interaction; Robotics and automation
Identifiers
urn:nbn:se:kth:diva-359255 (URN)
10.1145/3687272.3688302 (DOI)
001436563800034 (ISI)
2-s2.0-85215528129 (Scopus ID)
Conference
12th International Conference on Human-Agent Interaction, HAI 2024, Swansea, United Kingdom of Great Britain and Northern Ireland, Nov 24 2024 - Nov 27 2024
Note

Part of ISBN 9798400708244

QC 20250131

Available from: 2025-01-29. Created: 2025-01-29. Last updated: 2025-04-28. Bibliographically approved.
Xia, H., Zhang, Y., Rajabi, N., Taleb, F., Yang, Q., Kragic, D. & Li, Z. (2024). Shaping high-performance wearable robots for human motor and sensory reconstruction and enhancement. Nature Communications, 15(1), Article ID 1760.
Shaping high-performance wearable robots for human motor and sensory reconstruction and enhancement
2024 (English) In: Nature Communications, E-ISSN 2041-1723, Vol. 15, no 1, article id 1760. Article in journal (Refereed). Published
Abstract [en]

Most wearable robots, such as exoskeletons and prostheses, can operate with dexterity, yet wearers do not perceive them as part of their bodies. In this perspective, we contend that integrating environmental, physiological, and physical information through multi-modal fusion, incorporating human-in-the-loop control, utilizing neuromuscular interfaces, employing flexible electronics, and acquiring and processing human-robot information with biomechatronic chips should all be leveraged towards building the next generation of wearable robots. These technologies could improve the embodiment of wearable robots. With optimizations in mechanical structure and clinical training, the next generation of wearable robots should better facilitate human motor and sensory reconstruction and enhancement.

Place, publisher, year, edition, pages
Springer Nature, 2024
National Category
Robotics and automation
Identifiers
urn:nbn:se:kth:diva-344345 (URN)
10.1038/s41467-024-46249-0 (DOI)
38409128 (PubMedID)
2-s2.0-85186407087 (Scopus ID)
Note

QC 20240315

Available from: 2024-03-13. Created: 2024-03-13. Last updated: 2025-02-09. Bibliographically approved.
Rajabi, N., Khanna, P., Kanik, S. U. D., Yadollahi, E., Vasco, M., Björkman, M., . . . Kragic, D. (2023). Detecting the Intention of Object Handover in Human-Robot Collaborations: An EEG Study. In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication, RO-MAN. Paper presented at 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Aug 28-31, 2023, Busan, South Korea (pp. 549-555). Institute of Electrical and Electronics Engineers (IEEE)
Detecting the Intention of Object Handover in Human-Robot Collaborations: An EEG Study
2023 (English) In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication, RO-MAN, Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 549-555. Conference paper, Published paper (Refereed)
Abstract [en]

Human-robot collaboration (HRC) relies on smooth and safe interactions. In this paper, we focus on the human-to-robot handover scenario, where the robot acts as the taker. We investigate the feasibility of detecting the intention of a human-to-robot handover action through the analysis of electroencephalogram (EEG) signals. Our study confirms that temporal patterns in EEG signals carry information about motor planning and can be leveraged to predict whether an individual will execute a motor task, with an average accuracy of 94.7%. Our results also suggest that time-frequency features of EEG signals in the final second prior to movement are effective for distinguishing handover actions from other actions. Furthermore, we classify human intentions for different tasks based on time-frequency representations of pre-movement EEG signals and achieve an average accuracy of 63.5% when contrasting each pair of tasks. These results encourage the use of EEG signals to detect human handover intention in HRC tasks.
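
The abstract does not spell out the classification pipeline; the sketch below shows one plausible reading of pre-movement time-frequency classification. The sampling rate, STFT window, LDA classifier, and random placeholder data are all assumptions rather than the paper's method.

```python
import numpy as np
from scipy.signal import stft
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 250  # assumed sampling rate (Hz)

def time_frequency_features(epoch):
    """Log band power of a short-time Fourier transform over one
    (n_channels, FS) epoch covering the final pre-movement second."""
    _, _, Z = stft(epoch, fs=FS, nperseg=FS // 2)
    power = np.abs(Z) ** 2
    return np.log(power.mean(axis=-1) + 1e-12).ravel()

# Placeholder epochs (n_trials, n_channels, FS) and binary labels
# (handover vs. other action); real data would come from the EEG cap.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((60, 8, FS))
labels = rng.integers(0, 2, size=60)
X = np.array([time_frequency_features(e) for e in epochs])
print(cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5).mean())
```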

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
Series
IEEE RO-MAN, ISSN 1944-9445
National Category
Robotics and automation
Identifiers
urn:nbn:se:kth:diva-342040 (URN)
10.1109/RO-MAN57019.2023.10309426 (DOI)
001108678600078 (ISI)
2-s2.0-85186991854 (Scopus ID)
Conference
32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Aug 28-31, 2023, Busan, South Korea
Note

Part of proceedings ISBN 979-8-3503-3670-2

QC 20240110

Available from: 2024-01-10. Created: 2024-01-10. Last updated: 2025-02-09. Bibliographically approved.
Rajabi, N., Chernik, C., Reichlin, A., Taleb, F., Vasco, M., Ghadirzadeh, A., . . . Kragic, D. (2023). Mental Face Image Retrieval Based on a Closed-Loop Brain-Computer Interface. In: Augmented Cognition: 17th International Conference, AC 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Proceedings. Paper presented at 17th International Conference on Augmented Cognition, AC 2023, held as part of the 25th International Conference on Human-Computer Interaction, HCII 2023, Copenhagen, Denmark, Jul 23 2023 - Jul 28 2023 (pp. 26-45). Springer Nature
Mental Face Image Retrieval Based on a Closed-Loop Brain-Computer Interface
2023 (English) In: Augmented Cognition: 17th International Conference, AC 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Proceedings, Springer Nature, 2023, p. 26-45. Conference paper, Published paper (Refereed)
Abstract [en]

Retrieval of mental images from measured brain activity may facilitate communication, especially when verbal or muscular communication is impossible or inefficient. Existing work focuses mostly on retrieving the observed visual stimulus, whereas our interest is in retrieving the imagined mental image. We present a closed-loop brain-computer interface (BCI) framework to retrieve mental images of human faces. We utilize EEG signals as binary feedback to determine the relevance of an image to the target mental image, and we employ this feedback to traverse the latent space of a generative model, proposing new images closer to the actual target image. We evaluate the proposed framework on 13 volunteers. Unlike previous studies, we do not restrict the possible attributes of the resulting images to predefined semantic classes. Subjective and objective tests validate the ability of our model to retrieve face images similar to the actual target mental images.
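
As a rough illustration of the closed-loop idea, the sketch below nudges a latent point toward candidates that receive positive feedback; the update rule is a guess, and decode and classify_feedback are hypothetical stand-ins for the generative model and the EEG relevance classifier.

```python
import numpy as np

def retrieve_mental_image(decode, classify_feedback, z_dim=64,
                          steps=50, step_size=0.5, seed=0):
    """Closed-loop latent traversal sketch: propose an image from a
    perturbed latent point, read binary (EEG-derived) relevance
    feedback, and keep the moves judged closer to the target image."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(z_dim)
    for _ in range(steps):
        direction = rng.standard_normal(z_dim)
        candidate = z + step_size * direction
        image = decode(candidate)            # generative model, e.g. a GAN
        if classify_feedback(image):         # True if EEG says "relevant"
            z = candidate                    # accept the move
        else:
            z = z - step_size * direction    # back away from the proposal
    return decode(z)
```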

Place, publisher, year, edition, pages
Springer Nature, 2023
Keywords
Brain-Computer Interface, EEG, Generative Models, Mental Image Retrieval
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:kth:diva-337884 (URN)
10.1007/978-3-031-35017-7_3 (DOI)
001286423000003 (ISI)
2-s2.0-85171440140 (Scopus ID)
Conference
17th International Conference on Augmented Cognition, AC 2023, held as part of the 25th International Conference on Human-Computer Interaction, HCII 2023, Copenhagen, Denmark, Jul 23 2023 - Jul 28 2023
Note

Part of ISBN 9783031350160

QC 20231010

Available from: 2023-10-10. Created: 2023-10-10. Last updated: 2025-02-07. Bibliographically approved.
Khanna, P., Rajabi, N., Demir Kanik, S. U., Kragic, D., Björkman, M. & Smith, C. Early Detection of Human Handover Intentions in Human-Robot Collaboration: Comparing EEG, Gaze, and Hand Motion.
Early Detection of Human Handover Intentions in Human-Robot Collaboration: Comparing EEG, Gaze, and Hand Motion
(English) Manuscript (preprint) (Other academic)
Abstract [en]

Human-robot collaboration (HRC) relies on accurate and timely recognition of human intentions to ensure seamless interactions. Among common HRC tasks, human-to-robot object handovers have been studied extensively for planning the robot's actions during object reception, under the assumption that the human intends a handover. However, distinguishing handover intentions from other actions has received limited attention. Most research on handovers has focused on visually detecting motion trajectories, which often results in delays or false detections when trajectories overlap. This paper investigates whether human intentions for object handovers are reflected in non-movement-based physiological signals. We conduct a multimodal analysis comparing three data modalities: electroencephalogram (EEG), gaze, and hand-motion signals. Our study aims to distinguish between handover-intended human motions and non-handover motions in an HRC setting, evaluating each modality's performance in predicting and classifying these actions before and after human movement initiation. We develop and evaluate human intention detectors based on these modalities, comparing their accuracy and timing in identifying handover intentions. To the best of our knowledge, this is the first study to systematically develop and test intention detectors across multiple modalities within the same experimental context of human-robot handovers. Our analysis reveals that handover intention can be detected from all three modalities; gaze signals, however, are both the earliest and the most accurate in classifying a motion as intended for handover or not.
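
To make the per-modality comparison concrete, a minimal sketch of training and scoring one intention detector per modality might look as follows; the logistic-regression detector, feature dimensions, and random placeholder data are assumptions rather than the manuscript's method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def evaluate_modality(name, features, labels):
    """Cross-validated accuracy of a per-modality intention detector;
    features are whatever each modality yields per trial (e.g., EEG
    band power, gaze fixation statistics, hand kinematics)."""
    acc = cross_val_score(LogisticRegression(max_iter=1000),
                          features, labels, cv=5).mean()
    print(f"{name:12s} accuracy: {acc:.2f}")

# Placeholder data: 80 trials, handover (1) vs. non-handover (0).
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=80)
for name, dim in [("EEG", 64), ("gaze", 6), ("hand motion", 9)]:
    evaluate_modality(name, rng.standard_normal((80, dim)), labels)
```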

Keywords
Human-Robot Collaboration (HRC), Human-Robot Handovers, EEG, Gaze, Motion Analysis
National Category
Robotics and automation
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-360885 (URN)
10.48550/arXiv.2502.11752 (DOI)
Note

QC 20250305

Available from: 2025-03-04. Created: 2025-03-04. Last updated: 2025-03-06. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0003-2533-7868
