Learning a dictionary of prototypical grasp-predicting parts from grasping experience
Detry, Renaud; Ek, Carl Henrik; Madry, Marianna; Kragic, Danica
KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP; Centre for Autonomous Systems, CAS (all authors). ORCID iD: 0000-0003-2965-2953
2013 (English). In: 2013 IEEE International Conference on Robotics and Automation (ICRA), New York: IEEE, 2013, p. 601-608. Conference paper, published paper (refereed).
Abstract [en]

We present a real-world robotic agent that is capable of transferring grasping strategies across objects that share similar parts. The agent transfers grasps across objects by identifying, from examples provided by a teacher, parts by which objects are often grasped in a similar fashion. It then uses these parts to identify grasping points on novel objects. We focus our report on the definition of a similarity measure that reflects whether the shapes of two parts resemble each other, and whether their associated grasps are applied near one another. We present an experiment in which our agent extracts five prototypical parts from thirty-two real-world grasp examples, and we demonstrate the applicability of the prototypical parts for grasping novel objects.
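The abstract only summarizes the similarity measure; as a rough, hypothetical sketch of the idea, a similarity between two grasped parts might combine a shape-resemblance term with a grasp-placement term. All names, the point-cloud representation, and the exponential combination below are assumptions for illustration, not the authors' actual definition.

```python
import numpy as np

def part_similarity(shape_a, shape_b, grasp_a, grasp_b,
                    shape_weight=1.0, grasp_weight=1.0):
    """Hypothetical similarity between two grasped object parts.

    shape_a, shape_b: (N, 3) and (M, 3) point clouds of the two parts,
    assumed pre-aligned in a common frame. grasp_a, grasp_b: 3-vectors
    giving each grasp's position relative to its part.
    """
    # Shape term: symmetric mean nearest-neighbour distance between the clouds.
    d = np.sqrt(((shape_a[:, None, :] - shape_b[None, :, :]) ** 2).sum(-1))
    shape_dist = 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())
    # Grasp term: how far apart the two grasps are applied on their parts.
    grasp_dist = np.linalg.norm(np.asarray(grasp_a) - np.asarray(grasp_b))
    # Map the weighted distance to a similarity in (0, 1].
    return np.exp(-(shape_weight * shape_dist + grasp_weight * grasp_dist))
```

Under this reading, the five prototypical parts could be obtained by clustering the thirty-two grasp examples under such a similarity (e.g., with agglomerative clustering) and keeping one representative part per cluster; the paper's actual procedure may differ.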

Place, publisher, year, edition, pages
New York: IEEE, 2013. p. 601-608
Series
IEEE International Conference on Robotics and Automation, ISSN 1050-4729
Keywords [en]
Robotics, Grasping, Grasping strategy, Object grasping, Prototypical grasp-predicting part, Dimensionality reduction
National Category
Computer graphics and computer vision; Robotics and automation
Identifiers
URN: urn:nbn:se:kth:diva-136374
DOI: 10.1109/ICRA.2013.6630635
ISI: 000337617300088
Scopus ID: 2-s2.0-84887312609
ISBN: 978-1-4673-5641-1 (print)
OAI: oai:DiVA.org:kth-136374
DiVA id: diva2:675933
Conference
2013 IEEE International Conference on Robotics and Automation, ICRA 2013; Karlsruhe; Germany; 6 May 2013 through 10 May 2013
Funder
EU, FP7, Seventh Framework Programme, FP7-IP-027657
Swedish Foundation for Strategic Research
Swedish Research Council
EU, FP7, Seventh Framework Programme, IST-FP7-270436
Note

QC 20131216

Available from: 2013-12-04. Created: 2013-12-04. Last updated: 2025-02-05. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus
IEEE Xplore

Authority records

Detry, Renaud; Ek, Carl Henrik; Kragic, Danica

Search in DiVA

By author/editor
Detry, Renaud; Ek, Carl Henrik; Madry, Marianna; Kragic, Danica
By organisation
Computer Vision and Active Perception, CVAP; Centre for Autonomous Systems, CAS
