Remote robotic assembly guided by 3D models linking to a real robot
Wang, Lihui (KTH, School of Industrial Engineering and Management (ITM), Production Engineering). ORCID iD: 0000-0001-8679-8049
Mohammed, Abdullah (KTH, School of Industrial Engineering and Management (ITM), Production Engineering)
Onori, Mauro (KTH, School of Industrial Engineering and Management (ITM), Production Engineering). ORCID iD: 0000-0002-0006-283X
2014 (English). In: CIRP annals, ISSN 0007-8506, E-ISSN 1726-0604, Vol. 63, no. 1, pp. 1-4. Article in journal (Refereed). Published
Abstract [en]

This paper presents a 3D model-driven remote robotic assembly system. It constructs 3D models at runtime to represent unknown geometries at the robot side, using a sequence of images from a calibrated camera in different poses. Guided by the 3D models over the Internet, a remote operator can manipulate a real robot instantly for remote assembly operations. Experimental results show that the system can meet industrial assembly requirements with an acceptable level of modelling quality and relatively short processing time. The system also enables programming-free robotic assembly, where the real robot follows the human's assembly operations instantly.
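
The related thesis record below describes this runtime modelling as silhouette-based: a robot-mounted, calibrated camera captures component silhouettes from several poses, and the 3D model is built from them. As a rough illustration only, the Python sketch that follows carves a voxel grid against synthetic silhouettes (a visual-hull / space-carving style approach). The pinhole camera model, the poses, the grid resolution and the sphere standing in for the unknown part are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch of silhouette-based space carving (NOT the paper's implementation).
import numpy as np

def make_camera(position, img_size=64, f=60.0):
    """Simple look-at pinhole camera aimed at the world origin (assumed setup)."""
    z = -position / np.linalg.norm(position)        # viewing direction, towards origin
    x = np.cross([0.0, 0.0, 1.0], z)
    x = x / np.linalg.norm(x)                        # assumes the pose is not on the z-axis
    y = np.cross(z, x)
    R = np.stack([x, y, z])                          # world -> camera rotation
    t = -R @ position
    K = np.array([[f, 0.0, img_size / 2],
                  [0.0, f, img_size / 2],
                  [0.0, 0.0, 1.0]])
    return K @ np.hstack([R, t[:, None]])            # 3x4 projection matrix

def project(P, pts):
    """Project Nx3 world points to pixel (u, v) coordinates."""
    h = np.hstack([pts, np.ones((len(pts), 1))]) @ P.T
    return h[:, :2] / h[:, 2:3]

def carve(silhouettes, cameras, grid, img_size=64):
    """Keep only the voxels whose projection falls inside every silhouette."""
    keep = np.ones(len(grid), dtype=bool)
    for sil, P in zip(silhouettes, cameras):
        uv = np.round(project(P, grid)).astype(int)
        inb = (uv[:, 0] >= 0) & (uv[:, 0] < img_size) & \
              (uv[:, 1] >= 0) & (uv[:, 1] < img_size)
        hit = np.zeros(len(grid), dtype=bool)
        hit[inb] = sil[uv[inb, 1], uv[inb, 0]]
        keep &= hit
    return grid[keep]

# Synthetic example: a unit sphere stands in for the unknown component.
poses = [(4, 0, 1), (0, 4, 1), (-4, 0, 1), (0, -4, 1), (3, -3, 2)]
cameras = [make_camera(np.array(p, dtype=float)) for p in poses]

axis = np.linspace(-1.2, 1.2, 50)
grid = np.stack(np.meshgrid(axis, axis, axis), -1).reshape(-1, 3)
sphere = grid[np.linalg.norm(grid, axis=1) <= 1.0]

silhouettes = []
for P in cameras:
    img = np.zeros((64, 64), dtype=bool)
    uv = np.round(project(P, sphere)).astype(int)
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < 64) & (uv[:, 1] >= 0) & (uv[:, 1] < 64)
    img[uv[ok, 1], uv[ok, 0]] = True                 # rasterise the synthetic silhouette
    silhouettes.append(img)

model = carve(silhouettes, cameras, grid)
print(f"kept {len(model)} of {len(grid)} voxels in the carved model")
```

A real system would instead segment silhouettes from images of the actual component and use the calibration of the robot-mounted camera; the sketch only shows the carving idea.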

Place, publisher, year, edition, pages
Elsevier, 2014. Vol. 63, no. 1, pp. 1-4
Keywords [en]
Robot, 3D-image processing, Assembly
Identifiers
URN: urn:nbn:se:kth:diva-148644
DOI: 10.1016/j.cirp.2014.03.013
ISI: 000338811000001
Scopus ID: 2-s2.0-84902543990
OAI: oai:DiVA.org:kth-148644
DiVA id: diva2:737027
Note

QC 20140811

Available from: 2014-08-11. Created: 2014-08-11. Last updated: 2017-12-05. Bibliographically checked.
Part of thesis
1. Toward a Sustainable Human-Robot Collaborative Production Environment
2017 (English). Doctoral thesis, with papers (Other academic)
Abstract [en]

This PhD study aimed to address the sustainability issues of robotic systems from the environmental and social aspects. During the research, three approaches were developed. The first is an online, programming-free, model-driven system that utilises a web-based distributed human-robot collaboration architecture to perform distant assembly operations. It uses a robot-mounted camera to capture the silhouettes of the components from different angles. The system then analyses those silhouettes and constructs the corresponding 3D models. Using the 3D models together with the model of a robotic assembly cell, the system guides a distant human operator to assemble the real components in the actual robotic cell. To satisfy the safety aspect of human-robot collaboration, a second approach has been developed for effective online collision avoidance in an augmented environment, where virtual three-dimensional (3D) models of robots and real images of human operators from depth cameras are used for monitoring and collision detection. A prototype system is developed and linked to industrial robot controllers for adaptive robot control, without the need for programming by the operators. The result of collision detection yields four safety strategies: the system can alert an operator, stop a robot, move the robot away, or modify the robot's trajectory away from an approaching operator. These strategies can be activated based on the operator's location with respect to the robot. The case study of the research further discusses the possibility of implementing the developed method in realistic applications, for example, collaboration between robots and humans in an assembly line. To tackle the energy aspect of sustainability for the human-robot production environment, a third approach has been developed which aims to minimise the robot's energy consumption during assembly. Given a trajectory and based on the inverse kinematics and dynamics of the robot, a set of attainable configurations is determined; this is pursued by calculating the suitable forces and torques on the joints and links of the robot. The energy consumption is then calculated for each configuration along the assigned trajectory, and the configurations with the lowest energy consumption are selected.
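
The second approach ends in a choice among four safety strategies driven by the operator's location relative to the robot. A minimal sketch of how such a selection could look is given below; the zone thresholds and the zone-to-strategy mapping are illustrative assumptions, and the depth-camera monitoring and collision detection that would supply the operator-robot distance are not shown.

```python
# Illustrative sketch of distance-based safety strategy selection
# (thresholds and mapping are hypothetical, not taken from the thesis).
from enum import Enum

class Strategy(Enum):
    NONE = "continue normal operation"
    ALERT = "alert the operator"
    MODIFY_PATH = "modify the robot trajectory away from the operator"
    MOVE_AWAY = "move the robot away"
    STOP = "stop the robot"

def select_strategy(distance_m: float) -> Strategy:
    """Map the minimum operator-robot distance (metres) to a safety strategy."""
    if distance_m > 2.0:
        return Strategy.NONE
    if distance_m > 1.5:
        return Strategy.ALERT
    if distance_m > 1.0:
        return Strategy.MODIFY_PATH
    if distance_m > 0.5:
        return Strategy.MOVE_AWAY
    return Strategy.STOP

if __name__ == "__main__":
    for d in (2.5, 1.8, 1.2, 0.8, 0.3):
        print(f"operator at {d:.1f} m -> {select_strategy(d).value}")
```

In practice the thresholds would be tuned to the robot's speed and reach and to the protective separation distances required by the applicable safety standards.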

Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2017. p. 98
Series
TRITA-IIP, ISSN 1650-1888 ; 17-01
Keywords
vision sensor, 3D image processing, collision detection, safety, robot, kinematics, dynamics, collaborative assembly, energy consumption, optimisation, manufacturing
Research programme
Production Engineering
Identifiers
URN: urn:nbn:se:kth:diva-202388
ISBN: 978-91-7729-301-9
Public defence
2017-03-24, M311, Brinellvägen 68, Stockholm, 10:00 (English)
Note

QC 20170223

Available from: 2017-02-23. Created: 2017-02-22. Last updated: 2017-02-23. Bibliographically checked.

Open Access in DiVA

Full text missing in DiVA

Other links

Publisher's full text | Scopus
