kth.se Publications
Multimodal Data-Driven Robot Control for Human-Robot Collaborative Assembly
KTH, School of Industrial Engineering and Management (ITM), Production Engineering, Sustainable Production Systems. ORCID iD: 0000-0002-1909-0507
KTH, School of Industrial Engineering and Management (ITM), Production Engineering, Sustainable Production Systems. ORCID iD: 0000-0001-8679-8049
KTH, School of Industrial Engineering and Management (ITM), Production Engineering, Sustainable Production Systems. ORCID iD: 0000-0001-9694-0483
2022 (English). In: Journal of Manufacturing Science and Engineering, ISSN 1087-1357, E-ISSN 1528-8935, Vol. 144, no. 5, article id 051012. Article in journal (Refereed). Published.
Abstract [en]

In human-robot collaborative assembly, leveraging multimodal commands for intuitive robot control remains a challenge, from command translation to efficient collaborative operations. This article investigates multimodal data-driven robot control for human-robot collaborative assembly. Leveraging function blocks, a programming-free human-robot interface is designed to fuse multimodal human commands and accurately trigger defined robot control modalities. Deep learning is explored to develop a command classification system for low-latency, high-accuracy robot control, in which a spatial-temporal graph convolutional network provides reliable and accurate translation of brainwave command phrases into robot commands. Multimodal data-driven high-level robot control during assembly is then facilitated by event-driven function blocks: the high-level commands serve as triggering events for the execution of algorithms for fine robot manipulation and assembly-feature-based collaborative assembly. Finally, a partial car engine assembly deployed to a robot team is chosen as a case study to demonstrate the effectiveness of the developed system.
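The abstract describes a pipeline in which a classifier translates a multimodal human command into a high-level label, and that label acts as the triggering event for a pre-defined, event-driven function block. A minimal sketch of that wiring, assuming a generic classifier; all class, function, and command names here are illustrative and not taken from the paper:

```python
# Hypothetical sketch of the command -> event -> function-block pipeline.
# Names are illustrative, not the paper's API.

class FunctionBlock:
    """Event-driven block wrapping one robot control routine."""
    def __init__(self, name, action):
        self.name = name
        self.action = action  # callable implementing fine robot manipulation

    def trigger(self, event):
        # Execute only when the triggering event matches this block.
        if event == self.name:
            return self.action()
        return None

def classify_command(raw_signal, model):
    """Stand-in for the deep-learning classifier (e.g. an ST-GCN) that
    maps a brainwave/voice/gesture signal to a command label."""
    return model(raw_signal)

# High-level commands act as triggering events for the blocks.
blocks = [
    FunctionBlock("pick", lambda: "gripper closes on part"),
    FunctionBlock("place", lambda: "part placed at assembly pose"),
]

def dispatch(raw_signal, model):
    event = classify_command(raw_signal, model)
    for fb in blocks:
        result = fb.trigger(event)
        if result is not None:
            return result
    return "no matching function block"
```

The point of the split is that the classifier only has to emit a coarse macro command; the fine (micro) control logic lives inside each block and is reused across modalities.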

Place, publisher, year, edition, pages
ASME International, 2022. Vol. 144, no. 5, article id 051012
Keywords [en]
robot, assembly, multimodal data, human-robot collaboration, brain robotics
National Category
Production Engineering, Human Work Science and Ergonomics
Identifiers
URN: urn:nbn:se:kth:diva-311282
DOI: 10.1115/1.4053806
ISI: 000776279600011
Scopus ID: 2-s2.0-85144601043
OAI: oai:DiVA.org:kth-311282
DiVA id: diva2:1653507
Note

QC 20220422

Available from: 2022-04-22. Created: 2022-04-22. Last updated: 2023-06-08. Bibliographically approved.
In thesis
1. Multimodal Human-Robot Collaboration in Assembly
2022 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Human-robot collaboration (HRC) envisioned for factories of the future requires close physical collaboration between humans and robots in safe and shared working environments, with enhanced efficiency and flexibility. This PhD study aims at multimodal human-robot collaboration in assembly. For this purpose, various modalities controlled by high-level human commands are adopted to facilitate multimodal robot control in assembly and to support efficient HRC. Voice commands, a commonly used communication channel, are first considered and adopted to control robots. Hand gestures, nonverbal commands that often accompany voice instructions, are also used for robot control, specifically for gripper control in robotic assembly. Algorithms are developed to train and identify the commands so that voice and hand-gesture instructions are associated with valid robot control commands at the controller level. A sensorless haptics modality is developed to allow human operators to haptically control robots without any external sensors. In this context, an accurate dynamic model of the robot (covering both the pre-sliding and sliding regimes) and an adaptive admittance observer are combined for reliable haptic robot control. In parallel, brainwaves serve as an emerging communication modality for adaptive robot control during seamless assembly, especially in noisy environments with unreliable voice recognition or when an operator is occupied with other tasks and unable to make gestures. Deep learning is explored to develop a robust brainwave classification system for high-accuracy robot control; the brainwaves act as macro commands that trigger pre-defined function blocks, which in turn provide micro control of robots in collaborative assembly. Brainwaves thus offer multimodal support to HRC assembly, as an alternative to haptic, voice, and gesture commands.
Next, a multimodal data-driven control approach to HRC assembly, assisted by event-driven function blocks, is explored to facilitate collaborative assembly and adaptive robot control. The proposed approaches and system design are analysed and validated through experiments on a partial car engine assembly. Finally, conclusions and future directions are given.
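The sensorless haptics modality described above rests on admittance control: an estimated external force is mapped to a velocity command through a virtual mass-damper law. A minimal discrete sketch of that idea, assuming illustrative mass, damping, and time-step values; the thesis additionally uses a friction-aware dynamic model and an adaptive observer, which are not reproduced here:

```python
# Minimal discrete admittance-control sketch for haptic robot guidance.
# M, D, and dt are illustrative values, not the thesis's parameters.

def admittance_step(f_ext, v_prev, M=2.0, D=8.0, dt=0.01):
    """One Euler step of M*dv/dt + D*v = f_ext.
    Maps an estimated external force (N) to a velocity command (m/s)."""
    dv = (f_ext - D * v_prev) / M
    return v_prev + dv * dt

# A constant push settles toward the steady-state velocity f_ext / D.
v = 0.0
for _ in range(2000):
    v = admittance_step(10.0, v)
# v converges to 10.0 / 8.0 = 1.25 m/s
```

Lowering D makes the robot feel lighter to guide; raising M smooths the response to noisy force estimates. In the sensorless case, f_ext would come from the dynamic model and observer rather than a force/torque sensor.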

Place, publisher, year, edition, pages
Brinellvägen 68, 114 28 Stockholm, Sweden: KTH Royal Institute of Technology, 2022. p. 118
Series
TRITA-ITM-AVL ; 2022:12
Keywords
Robotics, Assembly, Human-robot collaboration, Multimodal control, Function block
National Category
Production Engineering, Human Work Science and Ergonomics
Research subject
Production Engineering
Identifiers
URN: urn:nbn:se:kth:diva-311425
ISBN: 978-91-8040-215-6
Public defence
2022-05-20, https://kth-se.zoom.us/j/68935599845, Stockholm, 09:00 (English)
Available from: 2022-04-28. Created: 2022-04-27. Last updated: 2022-12-19. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text; Scopus

Authority records

Liu, Sichao; Wang, Lihui; Wang, Xi Vincent
