Detecting the Intention of Object Handover in Human-Robot Collaborations: An EEG Study
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0003-2533-7868
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0003-1932-1595
Ericsson Research, Stockholm, Sweden.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0001-7091-0104
2023 (English). In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 549-555. Conference paper, Published paper (Refereed).
Abstract [en]

Human-robot collaboration (HRC) relies on smooth and safe interactions. In this paper, we focus on the human-to-robot handover scenario, where the robot acts as the taker. We investigate the feasibility of detecting the intention of a human-to-robot handover action through the analysis of electroencephalogram (EEG) signals. Our study confirms that temporal patterns in EEG signals carry information about motor planning and can be leveraged to predict whether an individual will execute a motor task, with an average accuracy of 94.7%. Our results also suggest that time-frequency features of the EEG signals in the final second before movement are effective for distinguishing a handover action from other actions. Furthermore, we classify human intentions for different tasks based on time-frequency representations of pre-movement EEG signals, achieving an average accuracy of 63.5% when contrasting each pair of tasks. These results encourage the use of EEG signals to detect human handover intention in HRC tasks.
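The abstract outlines a pipeline that extracts time-frequency features from the EEG signal in the final second before movement and classifies the intended action pairwise. As a hedged illustration only (the sampling rate, channel count, spectrogram settings, synthetic data, and the SVM classifier below are assumptions for the sketch, not details taken from the paper), such a pairwise classification step might look roughly like this in Python:

# Rough sketch of pre-movement EEG intention classification via time-frequency
# features, in the spirit of the abstract. Sampling rate, channel count, window
# length, spectrogram settings and the SVM classifier are illustrative assumptions.
import numpy as np
from scipy.signal import spectrogram
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 250                 # assumed EEG sampling rate (Hz)
N_CHANNELS = 32          # assumed number of EEG channels
PRE_MOVEMENT_S = 1.0     # final second before movement onset

def time_frequency_features(epoch):
    """Flatten per-channel log-power spectrograms of a (channels, samples) epoch."""
    feats = []
    for channel in epoch:
        _, _, sxx = spectrogram(channel, fs=FS, nperseg=64, noverlap=32)
        feats.append(np.log1p(sxx).ravel())
    return np.concatenate(feats)

# Synthetic stand-in data: 40 pre-movement epochs per class (0 = handover, 1 = other task).
rng = np.random.default_rng(0)
epochs = rng.standard_normal((80, N_CHANNELS, int(FS * PRE_MOVEMENT_S)))
labels = np.repeat([0, 1], 40)

X = np.stack([time_frequency_features(e) for e in epochs])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("Pairwise CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())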

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 549-555
Series
IEEE RO-MAN, ISSN 1944-9445
National Category
Robotics and automation
Identifiers
URN: urn:nbn:se:kth:diva-342040
DOI: 10.1109/RO-MAN57019.2023.10309426
ISI: 001108678600078
Scopus ID: 2-s2.0-85186991854
OAI: oai:DiVA.org:kth-342040
DiVA, id: diva2:1826025
Conference
32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), August 28-31, 2023, Busan, South Korea
Note

Part of proceedings: ISBN 979-8-3503-3670-2

QC 20240110

Available from: 2024-01-10. Created: 2024-01-10. Last updated: 2025-02-09. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Rajabi, Nona; Khanna, Parag; Yadollahi, Elmira; Björkman, Mårten; Smith, Christian; Kragic, Danica
