Early Detection of Human Handover Intentions in Human-Robot Collaboration: Comparing EEG, Gaze, and Hand Motion
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0003-1932-1595
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0003-2533-7868
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0001-5976-0993
KTH, School of Electrical Engineering and Computer Science (EECS), Centres, Centre for Autonomous Systems, CAS. ORCID iD: 0000-0003-2965-2953
(English) Manuscript (preprint) (Other academic)
Abstract [en]

Human-robot collaboration (HRC) relies on accurate and timely recognition of human intentions to ensure seamless interactions. Among common HRC tasks, human-to-robot object handovers have been studied extensively for planning the robot's actions during object reception, assuming that the human intends a handover. However, distinguishing handover intentions from other actions has received limited attention. Most research on handovers has focused on visually detecting motion trajectories, which often results in delays or false detections when trajectories overlap. This paper investigates whether human intentions for object handovers are reflected in non-movement-based physiological signals. We conduct a multimodal analysis comparing three data modalities: electroencephalogram (EEG), gaze, and hand-motion signals. Our study aims to distinguish between handover-intended and non-handover human motions in an HRC setting, evaluating each modality's performance in predicting and classifying these actions before and after human movement initiation. We develop and evaluate human intention detectors based on these modalities, comparing their accuracy and timing in identifying handover intentions. To the best of our knowledge, this is the first study to systematically develop and test intention detectors across multiple modalities within the same experimental context of human-robot handovers. Our analysis shows that handover intention can be detected from all three modalities; gaze signals, however, are both the earliest and the most accurate in classifying a motion as handover-intended or not.
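The record describes the detectors only at a high level. As a hedged illustration of the kind of per-modality classifier the abstract describes (not the authors' implementation), the Python sketch below assumes each trial is a (time, channels) array for one modality (EEG, gaze, or hand motion) with a binary handover/non-handover label; the function names, window sizes, and logistic-regression baseline are all assumptions, not taken from the paper:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def windowed_features(trial, window, step):
    # Slice one (time, channels) recording into flattened sliding windows.
    starts = range(0, len(trial) - window + 1, step)
    return np.stack([trial[s:s + window].ravel() for s in starts])

def modality_accuracy(trials, labels, window=50, step=25):
    # trials: list of (time, channels) arrays for a single modality;
    # labels: 1 = handover-intended motion, 0 = non-handover motion.
    X = np.vstack([windowed_features(t, window, step) for t in trials])
    y = np.concatenate([np.full((len(t) - window) // step + 1, lab)
                        for t, lab in zip(trials, labels)])
    clf = LogisticRegression(max_iter=1000)  # simple baseline classifier
    return cross_val_score(clf, X, y, cv=5).mean()

Running modality_accuracy separately on EEG, gaze, and hand-motion trials, with windows anchored before versus after movement onset, would mirror the paper's comparison of detection accuracy and timing across the three modalities.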

Keywords [en]
Human-Robot Collaboration (HRC), Human-Robot Handovers, EEG, Gaze, Motion Analysis
National Category
Robotics and automation
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-360885
DOI: 10.48550/arXiv.2502.11752
OAI: oai:DiVA.org:kth-360885
DiVA, id: diva2:1942364
Note

QC 20250305

Available from: 2025-03-04. Created: 2025-03-04. Last updated: 2025-03-06. Bibliographically approved.
In thesis
1. Adaptive Handovers for Enhanced Human-Robot Collaboration: A Human-Inspired Approach
2025 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

As robots become more technologically capable, their presence in human environments is expected to increase, leading to more physical and social interactions between humans and robots. In these shared spaces, handovers, the act of transferring an object from one person to another, constitute a significant part of daily human interactions. This thesis focuses on enhancing human-robot interaction by drawing inspiration from human-to-human handovers.

In this thesis, we investigate forces in human handovers to formulate adaptive robot grip release strategies, specifically addressing when a robot should release an object as a human recipient begins to take it during a handover. We developed a data-driven grip release strategy based on a dataset of recorded human-human handovers and validated it experimentally in human-robot interactions. To refine this strategy for different object weights, we recorded additional handovers involving various weights, resulting in publicly available datasets and a weight-adaptive grip release strategy (a minimal illustrative sketch follows this paragraph). Further, this thesis examines how object weight affects human motion during handovers, enabling robots to observe changes in human motion to estimate object weights and to adapt their own motions to convey changes in object weight during handovers.

Additionally, we investigate the use of non-touch modalities, such as EEG brain signals and gaze tracking, to discern human intentions during handovers, specifically differentiating between motions intended for handovers and those that are not. 

Lastly, we explore how human-robot handovers can be used to resolve robotic failures by providing explanations for these failures and adapting those explanations based on human behavioral responses.

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2025. p. xx, 130
Series
TRITA-EECS-AVL ; 2025:29
Keywords
Human-Robot Collaboration, Human-Robot Handovers, Adaptive Handovers, Robotic Failures, Robotic Failure Explanation
National Category
Robotics and automation
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-360949 (URN)
978-91-8106-216-8 (ISBN)
Public defence
2025-03-31, https://kth-se.zoom.us/j/66859470351, F3 (Flodis), Lindstedsvägen 26 & 28, KTH Campus, Stockholm, 14:00 (English)
Note

QC 20250307

Available from: 2025-03-07. Created: 2025-03-06. Last updated: 2025-04-02. Bibliographically approved.

Open Access in DiVA

EarlyDetectionofHumanHandoverIntention (45778 kB), 31 downloads
File information
File name: FULLTEXT01.pdf
File size: 45778 kB
Checksum: SHA-512
1ea8d531e900e5044b03409f468ad72eac3e3311ad6deb0f74811d8a0ae185c7390e10154f001461099c1584a166664c9b440e83efef650083f8264ede494c9e
Type: fulltext
Mimetype: application/pdf

Authority records

Khanna, Parag; Rajabi, Nona; Demir Kanik, Sumeyra Ummuhan; Kragic, Danica; Björkman, Mårten; Smith, Christian
