Active collision avoidance for human–robot collaboration driven by vision sensors
KTH, School of Industrial Engineering and Management (ITM), Production Engineering. (Production Engineering Department)
University of Skövde.
KTH, School of Industrial Engineering and Management (ITM), Production Engineering. (Production Engineering Department). ORCID iD: 0000-0001-8679-8049
2016 (English). In: International journal of computer integrated manufacturing (Print), ISSN 0951-192X, E-ISSN 1362-3052, p. 1-11. Article in journal (Refereed). Published.
Abstract [en]

Establishing safe human–robot collaboration is an essential factor for improving efficiency and flexibility in today’s manufacturing environment. Targeting safety in human–robot collaboration, this paper reports a novel approach for effective online collision avoidance in an augmented environment, where virtual three-dimensional (3D) models of robots and real images of human operators from depth cameras are used for monitoring and collision detection. A prototype system is developed and linked to industrial robot controllers for adaptive robot control, without the need for programming by the operators. Based on the result of collision detection, four safety strategies are applied: the system can alert an operator, stop a robot, move the robot away, or modify the robot’s trajectory away from an approaching operator. These strategies are activated based on the operator’s presence and location with respect to the robot. The case study of the research further discusses the possibility of implementing the developed method in realistic applications, for example, collaboration between robots and humans in an assembly line.
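To make the zone-based behaviour concrete, the minimal Python sketch below shows how such a distance-based strategy selector could look. The zone thresholds, function names and the path_blocked flag are hypothetical illustrations under assumed values, not the authors' implementation.

# Minimal sketch of a distance-based safety-strategy selector, assuming a
# depth-camera pipeline that yields the shortest operator-robot distance.
# Thresholds and names are illustrative, not the paper's implementation.
from enum import Enum


class Strategy(Enum):
    NONE = "no action"             # no operator detected near the robot
    WARN = "alert operator"        # operator inside the outer monitoring zone
    STOP = "stop robot"            # operator close; halt motion
    RETREAT = "move robot away"    # operator very close; back the robot off
    REPLAN = "modify trajectory"   # operator intersects the planned path


def select_strategy(operator_present: bool,
                    min_distance_m: float,
                    path_blocked: bool) -> Strategy:
    """Pick a safety strategy from the operator's presence and proximity."""
    if not operator_present:
        return Strategy.NONE
    if min_distance_m < 0.3:       # hypothetical critical zone
        return Strategy.RETREAT
    if min_distance_m < 0.6:       # hypothetical stop zone
        return Strategy.STOP
    if path_blocked:               # operator blocks the planned trajectory
        return Strategy.REPLAN
    if min_distance_m < 1.5:       # hypothetical warning zone
        return Strategy.WARN
    return Strategy.NONE


if __name__ == "__main__":
    # Example: operator detected 0.5 m from the nearest robot link.
    print(select_strategy(True, 0.5, path_blocked=False))  # Strategy.STOP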

Place, publisher, year, edition, pages
Taylor & Francis, 2016. p. 1-11
Keywords [en]
collision detection, collaborative assembly, safety, vision sensor
National Category
Engineering and Technology
Research subject
Production Engineering
Identifiers
URN: urn:nbn:se:kth:diva-202380
DOI: 10.1080/0951192X.2016.1268269
ISI: 000402991300006
Scopus ID: 2-s2.0-85006100717
OAI: oai:DiVA.org:kth-202380
DiVA, id: diva2:1076291
Note

QC 20170308

Available from: 2017-02-22. Created: 2017-02-22. Last updated: 2017-07-03. Bibliographically approved.
In thesis
1. Toward a Sustainable Human-Robot Collaborative Production Environment
2017 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

This PhD study aimed to address the sustainability issues of robotic systems from the environmental and social aspects. During the research, three approaches were developed.

The first is an online, programming-free, model-driven system that utilises a web-based distributed human-robot collaboration architecture to perform distant assembly operations. It uses a robot-mounted camera to capture the silhouettes of the components from different angles. The system then analyses those silhouettes and constructs the corresponding 3D models. Using the 3D models together with the model of a robotic assembly cell, the system guides a distant human operator to assemble the real components in the actual robotic cell.

To address the safety aspect of human-robot collaboration, a second approach was developed for effective online collision avoidance in an augmented environment, where virtual three-dimensional (3D) models of robots and real images of human operators from depth cameras are used for monitoring and collision detection. A prototype system is developed and linked to industrial robot controllers for adaptive robot control, without the need for programming by the operators. Based on the result of collision detection, four safety strategies are applied: the system can alert an operator, stop a robot, move the robot away, or modify the robot’s trajectory away from an approaching operator. These strategies are activated based on the operator’s location with respect to the robot. The case study of the research further discusses the possibility of implementing the developed method in realistic applications, for example, collaboration between robots and humans in an assembly line.

To tackle the energy aspect of sustainability in the human-robot production environment, a third approach was developed that aims to minimise the robot’s energy consumption during assembly. Given a trajectory, and based on the inverse kinematics and dynamics of the robot, a set of attainable configurations is determined; each configuration is then evaluated by calculating the required forces and torques on the joints and links of the robot. The energy consumption is calculated for each configuration along the assigned trajectory, and the configurations with the lowest energy consumption are selected.
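As a rough illustration of the configuration-selection idea in the last paragraph, the minimal Python sketch below estimates the energy of candidate joint-space paths for the same trajectory and keeps the cheapest one. The energy model (|torque x joint speed| integrated over time) and the placeholder inverse-dynamics callbacks are assumptions for illustration, not the thesis implementation.

# Minimal sketch: pick the lowest-energy joint-space path for a trajectory.
# The inverse-dynamics and joint-speed routines are supplied by the caller
# (placeholders here), so the sketch stays independent of any robot model.
from typing import Callable, List, Sequence, Tuple

Config = Tuple[float, ...]   # one joint-angle vector (radians)


def trajectory_energy(configs: Sequence[Config],
                      joint_torques: Callable[[Config, Config], Sequence[float]],
                      joint_speeds: Callable[[Config, Config], Sequence[float]],
                      dt: float) -> float:
    """Approximate mechanical energy as the sum of |torque * speed| * dt
    over consecutive configurations along the path."""
    energy = 0.0
    for prev, curr in zip(configs, configs[1:]):
        taus = joint_torques(prev, curr)    # inverse dynamics (placeholder)
        omegas = joint_speeds(prev, curr)   # finite-difference joint speeds
        energy += sum(abs(t * w) for t, w in zip(taus, omegas)) * dt
    return energy


def lowest_energy_path(candidate_paths: List[List[Config]],
                       joint_torques, joint_speeds,
                       dt: float) -> List[Config]:
    """Among attainable joint-space paths realising the same Cartesian
    trajectory, return the one with the smallest estimated energy."""
    return min(candidate_paths,
               key=lambda path: trajectory_energy(path, joint_torques,
                                                  joint_speeds, dt))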

Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2017. p. 98
Series
TRITA-IIP, ISSN 1650-1888 ; 17-01
Keywords
vision sensor, 3D image processing, collision detection, safety, robot, kinematics, dynamics, collaborative assembly, energy consumption, optimisation, manufacturing
National Category
Engineering and Technology
Research subject
Production Engineering
Identifiers
urn:nbn:se:kth:diva-202388 (URN)
978-91-7729-301-9 (ISBN)
Public defence
2017-03-24, M311, Brinellvägen 68, Stockholm, 10:00 (English)
Opponent
Supervisors
Note

QC 20170223

Available from: 2017-02-23. Created: 2017-02-22. Last updated: 2017-02-23. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links
Publisher's full text | Scopus

Authority records
Wang, Lihui

Search in DiVA
By author/editor: Mohammed, Abdullah; Wang, Lihui
By organisation: Production Engineering
In the same journal: International journal of computer integrated manufacturing (Print)
On the subject: Engineering and Technology
