Team KTH’s Picking Solution for the Amazon Picking Challenge 2016
KTH, School of Computer Science and Communication (CSC), Robotics, Perception and Learning, RPL. ORCID iD: 0000-0003-3252-715X
KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. ORCID iD: 0000-0002-3111-3812
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0002-6716-1111
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Robotics, Perception and Learning, RPL.
2017 (English). In: Warehouse Picking Automation Workshop 2017: Solutions, Experience, Learnings and Outlook of the Amazon Robotics Challenge, 2017. Conference paper, oral presentation only (Other (popular science, discussion, etc.)).
Abstract [en]

In this work we summarize the solution developed by Team KTH for the Amazon Picking Challenge 2016 in Leipzig, Germany. The competition simulated a warehouse automation scenario and was divided into two tasks: a picking task, in which a robot picks items from a shelf and places them in a tote, and a stowing task, the inverse, in which the robot picks items from a tote and places them in a shelf. We describe our approach, starting from a high-level overview of our system and then delving into the details of our perception pipeline and our strategy for manipulation and grasping. The solution was implemented on a Baxter robot equipped with additional sensors.
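
Both tasks can be read as one pick-and-place loop with the shelf and the tote swapping roles as source and destination. The sketch below is purely illustrative, assuming hypothetical perceive/grasp/place stubs and a made-up WorkOrder structure; it is not Team KTH's actual implementation, which ran on a Baxter robot with additional sensors.

    # Illustrative sketch only (hypothetical names, not Team KTH's code):
    # both challenge tasks reduce to "localize the item, grasp it, place it",
    # with shelf and tote swapping roles as source and destination.
    from dataclasses import dataclass

    @dataclass
    class WorkOrder:
        item: str      # target item name
        source: str    # a shelf bin ("bin_A", ...) or "tote"
        target: str    # "tote" for picking, a shelf bin for stowing

    def perceive(item, source):
        # Stand-in for the perception pipeline: localize the item in its bin or tote.
        print(f"localizing {item} in {source}")
        return (0.0, 0.0, 0.0)  # hypothetical object pose

    def grasp(pose):
        # Stand-in for the manipulation and grasping strategy.
        print(f"grasping object at {pose}")
        return True

    def place(target):
        print(f"placing object in {target}")

    def run(orders):
        for order in orders:
            pose = perceive(order.item, order.source)
            if pose is not None and grasp(pose):
                place(order.target)

    if __name__ == "__main__":
        run([WorkOrder("toothbrush", "bin_A", "tote"),   # picking: shelf -> tote
             WorkOrder("duct_tape", "tote", "bin_B")])   # stowing: tote -> shelf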

Place, publisher, year, edition, pages
2017.
National Category
Robotics and automation
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-215327
OAI: oai:DiVA.org:kth-215327
DiVA, id: diva2:1147692
Conference
ICRA 2017
Note

QC 20171009

Available from: 2017-10-07. Created: 2017-10-07. Last updated: 2025-02-09. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Conference webpage

Authority records

Almeida, Diogo; Ambrus, Rares; Caccamo, Sergio; Chen, Xi; Cruciani, Silvia; Pinto Basto De Carvalho, Joao F; Haustein, Joshua; Marzinotto, Alejandro; Vina, Francisco; Karayiannidis, Yiannis; Ögren, Petter; Jensfelt, Patric; Kragic, Danica
