Remote robotic assembly guided by 3D models linking to a real robot
KTH, School of Industrial Engineering and Management (ITM), Production Engineering. ORCID iD: 0000-0001-8679-8049
KTH, School of Industrial Engineering and Management (ITM), Production Engineering.
KTH, School of Industrial Engineering and Management (ITM), Production Engineering. ORCID iD: 0000-0002-0006-283X
2014 (English). In: CIRP Annals, ISSN 0007-8506, E-ISSN 1726-0604, Vol. 63, no. 1, pp. 1-4. Article in journal (Refereed). Published.
Abstract [en]

This paper presents a 3D model-driven remote robotic assembly system. It constructs 3D models at runtime to represent unknown geometries on the robot side, using a sequence of images from a calibrated camera in different poses. Guided by the 3D models over the Internet, a remote operator can instantly manipulate a real robot to perform remote assembly operations. Experimental results show that the system can meet industrial assembly requirements, with an acceptable level of modelling quality and relatively short processing time. The system also enables programming-free robotic assembly, in which the real robot instantly follows the human's assembly operations.
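The abstract does not give code, but the runtime 3D-model construction it describes (building geometry from calibrated camera images in different poses) matches the classic visual-hull idea: carve away every candidate voxel whose projection falls outside any object silhouette. Below is a minimal sketch of that idea, assuming known 3x4 projection matrices and binary silhouette masks; the function name and data layout are illustrative, not from the paper.

```python
import numpy as np

def carve_visual_hull(voxels, cameras, silhouettes):
    """Keep only voxels whose projection lies inside every silhouette.

    voxels      : (N, 3) array of candidate voxel centres (world frame)
    cameras     : list of 3x4 projection matrices (calibrated camera poses)
    silhouettes : list of 2D boolean masks, one per camera view
    """
    keep = np.ones(len(voxels), dtype=bool)
    homog = np.hstack([voxels, np.ones((len(voxels), 1))])   # (N, 4)
    for P, mask in zip(cameras, silhouettes):
        proj = homog @ P.T                       # (N, 3) homogeneous pixels
        uv = proj[:, :2] / proj[:, 2:3]          # perspective divide
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        h, w = mask.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        hit = np.zeros(len(voxels), dtype=bool)
        hit[inside] = mask[v[inside], u[inside]]
        keep &= hit                              # carve away any miss
    return voxels[keep]
```

With enough views, the surviving voxels approximate the component's shape; the trade-off the abstract mentions (modelling quality versus processing time) corresponds here to voxel-grid resolution and the number of camera poses.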

Place, publisher, year, edition, pages
Elsevier, 2014. Vol. 63, no. 1, pp. 1-4.
Keyword [en]
Robot, 3D-image processing, Assembly
National Category
Mechanical Engineering
Identifiers
URN: urn:nbn:se:kth:diva-148644
DOI: 10.1016/j.cirp.2014.03.013
ISI: 000338811000001
Scopus ID: 2-s2.0-84902543990
OAI: oai:DiVA.org:kth-148644
DiVA: diva2:737027
Note

QC 20140811

Available from: 2014-08-11. Created: 2014-08-11. Last updated: 2017-12-05. Bibliographically approved.
In thesis
1. Toward a Sustainable Human-Robot Collaborative Production Environment
2017 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

This PhD study aimed to address the sustainability of robotic systems from the environmental and social perspectives. Three approaches were developed during the research.

The first is an online, programming-free, model-driven system that uses a web-based distributed human-robot collaboration architecture to perform distant assembly operations. A robot-mounted camera captures silhouettes of the components from different angles; the system analyses those silhouettes and constructs the corresponding 3D models. Using the 3D models together with a model of the robotic assembly cell, the system guides a distant human operator in assembling the real components in the actual robotic cell.

To address the safety aspect of human-robot collaboration, a second approach was developed for effective online collision avoidance in an augmented environment, where virtual three-dimensional (3D) models of robots and real images of human operators from depth cameras are used for monitoring and collision detection. A prototype system was developed and linked to industrial robot controllers for adaptive robot control, without the need for programming by the operators. The collision-detection results drive four safety strategies: the system can alert an operator, stop the robot, move the robot away, or modify the robot's trajectory away from an approaching operator. These strategies are activated based on the operator's location with respect to the robot. The case study further discusses implementing the developed method in realistic applications, for example collaboration between robots and humans in an assembly line.

To tackle the energy aspect of sustainability in the human-robot production environment, a third approach was developed that aims to minimise robot energy consumption during assembly.
Given a trajectory, and based on the inverse kinematics and dynamics of the robot, a set of attainable configurations is determined, and the corresponding forces and torques on the robot's joints and links are calculated. The energy consumption is then calculated for each configuration along the assigned trajectory, and the configuration with the lowest energy consumption is selected.
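The selection step described above can be sketched in a few lines, assuming an inverse-dynamics model has already produced joint-torque samples for each attainable configuration along the assigned trajectory. The energy estimate used here (integrating |torque x joint velocity| over time) and the names and time step are illustrative assumptions, not taken from the thesis.

```python
def trajectory_energy(joint_velocities, joint_torques, dt):
    """Energy estimate for one candidate configuration: integrate
    |torque * joint velocity| per joint over the sampled trajectory.

    joint_velocities : list of per-sample joint-velocity vectors (rad/s)
    joint_torques    : list of per-sample joint-torque vectors (Nm)
    dt               : time step between samples (s)
    """
    total = 0.0
    for qd_step, tau_step in zip(joint_velocities, joint_torques):
        total += sum(abs(tau * qd) for tau, qd in zip(tau_step, qd_step)) * dt
    return total

def select_min_energy_configuration(candidates, dt=0.01):
    """candidates: list of (joint_velocities, joint_torques) pairs, one per
    attainable IK solution for the assigned trajectory. Returns the index
    of the candidate with the lowest estimated energy consumption."""
    energies = [trajectory_energy(qd, tau, dt) for qd, tau in candidates]
    return min(range(len(candidates)), key=lambda i: energies[i])
```

In practice the torque samples would come from the robot's inverse-dynamics model, so the ranking reflects how differently each IK solution loads the joints while tracking the same Cartesian path.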

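The four distance-based safety strategies described in the thesis abstract (alert, stop, move away, modify trajectory) can be sketched as a simple selector keyed on the operator's distance to the robot. The threshold values below are illustrative assumptions only; the thesis activates strategies by operator location but does not publish these numbers here.

```python
def choose_safety_strategy(distance_m):
    """Map operator-to-robot distance (metres) to one of the four safety
    strategies from the thesis: stop, move away, modify trajectory, or
    alert. Thresholds are hypothetical, for illustration only."""
    if distance_m < 0.5:
        return "stop"                # operator dangerously close: halt
    if distance_m < 1.0:
        return "move_away"           # retreat from the approaching operator
    if distance_m < 1.5:
        return "modify_trajectory"   # re-plan the path around the operator
    if distance_m < 2.5:
        return "alert"               # warn the operator, keep working
    return "none"                    # operator far away: no action needed
```

In the prototype described above, such a decision would be driven by the depth-camera collision-detection result and forwarded to the industrial robot controller for adaptive control.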
Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2017. 98 p.
Series
TRITA-IIP, ISSN 1650-1888 ; 17-01
Keyword
vision sensor, 3D image processing, collision detection, safety, robot, kinematics, dynamics, collaborative assembly, energy consumption, optimisation, manufacturing
National Category
Engineering and Technology
Research subject
Production Engineering
Identifiers
urn:nbn:se:kth:diva-202388 (URN)
978-91-7729-301-9 (ISBN)
Public defence
2017-03-24, M311, Brinellvägen 68, Stockholm, 10:00 (English)
Note

QC 20170223

Available from: 2017-02-23. Created: 2017-02-22. Last updated: 2017-02-23. Bibliographically approved.

Open Access in DiVA

No full text

Search in DiVA

By author/editor
Wang, Lihui; Mohammed, Abdullah; Onori, Mauro
By organisation
Production Engineering
