Toward Proactive Human-Robot Collaborative Assembly: A Multimodal Transfer-Learning-Enabled Action Prediction Approach
2022 (English). In: IEEE Transactions on Industrial Electronics, ISSN 0278-0046, E-ISSN 1557-9948, Vol. 69, no. 8, pp. 8579-8588. Article in journal (Refereed). Published.
Abstract [en]
Human-robot collaborative assembly (HRCA) is vital for achieving high-level flexible automation for mass personalization in today's smart factories. However, existing works in both industry and academia mainly focus on adaptive robot planning, while seldom considering human operators' intentions in advance. This hinders the transition of HRCA toward a proactive manner. To overcome this bottleneck, this article proposes a multimodal transfer-learning-enabled action prediction approach, serving as the prerequisite to ensure proactive HRCA. First, a multimodal intelligence-based action recognition approach is proposed to predict ongoing human actions by leveraging the visual stream and skeleton stream with short-time input frames. Second, a transfer-learning-enabled model is adapted to rapidly transfer learned knowledge from daily activities to industrial assembly operations for online operator intention analysis. Third, a dynamic decision-making mechanism, including robotic decision and motion control, is described to allow mobile robots to assist operators in a proactive manner. Finally, an aircraft bracket assembly task is demonstrated in the laboratory environment, and the comparative study result shows that the proposed approach outperforms other state-of-the-art ones for efficient action prediction.
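The abstract's first contribution combines a visual stream and a skeleton stream for action prediction. A common way to realize such a two-stream design is late fusion of per-class scores; the sketch below illustrates that idea only. The class names, logits, and fusion weight are illustrative assumptions, not details from the paper.

```python
import math

def softmax(scores):
    """Convert raw logits into a probability distribution."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_streams(visual_logits, skeleton_logits, alpha=0.6):
    """Weighted late fusion of two modality-specific score vectors.

    alpha weights the visual stream; (1 - alpha) weights the skeleton
    stream. Both the weight and the logits below are hypothetical.
    """
    v = softmax(visual_logits)
    s = softmax(skeleton_logits)
    return [alpha * pv + (1 - alpha) * ps for pv, ps in zip(v, s)]

# Illustrative assembly-action classes and stream outputs.
actions = ["pick part", "align bracket", "fasten bolt"]
visual_logits = [1.2, 0.4, 2.5]    # e.g. from a video-based recognizer
skeleton_logits = [0.8, 2.0, 1.9]  # e.g. from a skeleton-based recognizer

probs = fuse_streams(visual_logits, skeleton_logits)
predicted = actions[probs.index(max(probs))]
print(predicted)  # prints "fasten bolt" for these illustrative logits
```

In practice each stream would be a trained network over short-time input frames, and the fused distribution would drive the robot's proactive decision-making.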
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2022. Vol. 69, no. 8, pp. 8579-8588
Keywords [en]
Robots, Three-dimensional displays, Collaboration, Service robots, Skeleton, Videos, Visualization, Action recognition, human-robot collaboration, multimodal intelligence, transfer learning
National Category
Robotics and automation
Identifiers
URN: urn:nbn:se:kth:diva-310197
DOI: 10.1109/TIE.2021.3105977
ISI: 000764880700100
Scopus ID: 2-s2.0-85114652119
OAI: oai:DiVA.org:kth-310197
DiVA, id: diva2:1649543
Note
QC 20220404
Available from: 2022-04-04. Created: 2022-04-04. Last updated: 2025-02-09. Bibliographically approved.