2025 (English) In: Human Centric Smart Manufacturing Towards Industry 5.0, Springer Nature, 2025, p. 281-304. Chapter in book (Other academic)
Abstract [en]
The conventional automation approach has shown bottlenecks in the era of component assembly. In high-tech industrial production, what could be automated has largely been automated, leaving the remaining manual work to humans. To achieve ergonomic working environments and better productivity, human–robot collaboration has been adopted, combining the strength, accuracy, and repeatability of robots with the adaptability, high-level cognition, and flexibility of humans. A reliable human–robot collaborative setting should be supported by dynamically updated, precise models. For this purpose, a digital twin can realise the digital representation of physical collaborative settings through simulation modelling and data synchronisation, but it is limited by communication delays and constraints. This chapter develops a digital twin-enabled approach to human–robot collaborative assembly. Within this approach, sensor-driven 3D modelling of the physical devices of interest realises the physical-to-digital transformation of the human–robot workcell, and a Wise-ShopFloor-based platform fed by sensor data is used to build a digital twin model of the physical workcell. Function blocks with embedded algorithms then handle assembly planning, decision-making, and robot control, and a time-ahead execution and planning approach is developed for reliable human–robot collaborative assembly. Finally, the performance of the developed system is demonstrated through a case study of a partial car engine assembly.
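To illustrate the data-synchronisation and time-ahead execution ideas summarised in the abstract, the following is a minimal Python sketch. It is not the authors' implementation: the interfaces (`sensor_stream`, `twin_model.update`) and the constant-velocity extrapolation are assumptions chosen for illustration, and the chapter's function-block mechanisms are not reproduced here.

```python
import time
from dataclasses import dataclass

@dataclass
class RobotState:
    """Snapshot of the physical robot received from a sensor stream."""
    timestamp: float            # seconds, sensor clock
    joint_positions: list       # rad, one entry per joint
    joint_velocities: list      # rad/s, one entry per joint

def predict_ahead(state: RobotState, horizon: float) -> RobotState:
    """First-order, constant-velocity extrapolation of the robot state.

    A time-ahead estimate like this lets the digital twin act on where
    the robot will be once a command arrives, masking the round-trip
    communication delay noted in the abstract.
    """
    predicted = [q + dq * horizon
                 for q, dq in zip(state.joint_positions,
                                  state.joint_velocities)]
    return RobotState(state.timestamp + horizon,
                      predicted, state.joint_velocities)

def sync_loop(sensor_stream, twin_model, comm_delay=0.05):
    """Keep the digital twin aligned with the physical workcell.

    `sensor_stream` yields RobotState snapshots and `twin_model.update`
    repositions the 3D model; both are hypothetical interfaces standing
    in for the sensor-driven platform described in the chapter.
    """
    for state in sensor_stream:
        age = time.time() - state.timestamp   # staleness of the sample
        lookahead = age + comm_delay           # compensate delay both ways
        twin_model.update(predict_ahead(state, lookahead))
```

Under these assumptions, the twin is updated with a predicted rather than a raw state, so planning decisions made in the digital model refer to the workcell's likely configuration at execution time rather than its already-outdated sensed configuration.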
Place, publisher, year, edition, pages
Springer Nature, 2025
Keywords
Assembly, Digital twin, Robot
National Category
Production Engineering, Human Work Science and Ergonomics
Identifiers
urn:nbn:se:kth:diva-368723 (URN)
10.1007/978-3-031-82170-7_12 (DOI)
2-s2.0-105012012683 (Scopus ID)
Note
Part of ISBN 978-3-031-82169-1, 978-3-031-82170-7
Available from: 2025-08-20 Created: 2025-08-20 Last updated: 2025-08-20 Bibliographically approved