Multimodal data fusion framework enhanced robot-assisted minimally invasive surgery
2022 (English). In: Transactions of the Institute of Measurement and Control, ISSN 0142-3312, E-ISSN 1477-0369, Vol. 44, no. 4, pp. 735-743. Article in journal (refereed), published.
Abstract [en]
The widespread application of robot-assisted minimally invasive surgery (RAMIS) promotes human-machine interaction (HMI). Identifying the various behaviors of doctors can enhance the RAMIS procedure for a redundant robot, bridging intelligent robot control and activity-recognition strategies in the operating room, including hand gestures and human activities. In this paper, to enhance identification in dynamic situations, we propose a multimodal data fusion framework that provides multiple sources of information for accuracy enhancement. First, a multi-sensor hardware structure is designed to capture varied data from several devices, including a depth camera and a smartphone. Furthermore, the robot control mechanism can shift automatically across different surgical tasks. The experimental results demonstrate the efficiency of the multimodal framework for RAMIS by comparing it with a single-sensor system. Implementation on the KUKA LWR4+ in a surgical robot environment indicates that surgical robot systems can work alongside medical staff in the future.
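The abstract describes fusing information from a depth camera and a smartphone to improve activity recognition. The paper does not publish code, so the following is only an illustrative sketch of one common approach to such fusion, a weighted late fusion of per-modality classifier probabilities; the modality names, weights, and class counts here are hypothetical, not the authors' actual framework.

```python
import numpy as np

def late_fusion(prob_by_modality, weights=None):
    """Fuse per-modality class-probability vectors by weighted averaging.

    prob_by_modality: dict mapping modality name -> 1-D array of class
        probabilities (all vectors the same length).
    weights: optional dict mapping modality name -> non-negative weight;
        defaults to equal weighting across modalities.
    """
    names = list(prob_by_modality)
    if weights is None:
        weights = {m: 1.0 for m in names}
    total = sum(weights[m] for m in names)
    fused = sum(weights[m] * np.asarray(prob_by_modality[m], dtype=float)
                for m in names) / total
    return fused

# Hypothetical example: depth-camera and smartphone classifiers disagree
# on a three-gesture problem; fusion favors the more confident modality.
probs = {
    "depth_camera": [0.6, 0.3, 0.1],
    "smartphone":   [0.2, 0.7, 0.1],
}
fused = late_fusion(probs)          # -> [0.4, 0.5, 0.1]
predicted = int(np.argmax(fused))   # class index 1
```

Late fusion is only one design choice; feature-level (early) fusion or learned fusion weights are equally plausible readings of "multimodal data fusion" here.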
Place, publisher, year, edition, pages
SAGE Publications, 2022. Vol. 44, no. 4, pp. 735-743.
Keywords [en]
event-based control, human activity recognition, minimally invasive surgery, Multimodal data fusion, redundant manipulator
National Category
Robotics and automation
Identifiers
URN: urn:nbn:se:kth:diva-335694
DOI: 10.1177/0142331220984350
ISI: 000682757500001
Scopus ID: 2-s2.0-85099569459
OAI: oai:DiVA.org:kth-335694
DiVA id: diva2:1795050
Note
QC 20230907
2023-09-07, last updated 2025-02-09. Bibliographically approved.