Analytic grasp success prediction with tactile feedback
KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. ORCID iD: 0000-0003-2965-2953
2016 (English). In: Proceedings - IEEE International Conference on Robotics and Automation, Institute of Electrical and Electronics Engineers (IEEE), 2016, pp. 165-171. Conference paper (Refereed)
Abstract [en]

Predicting grasp success is useful for avoiding failures in many robotic applications. Reasoning in wrench space, we address the question of how well analytic grasp success prediction works when tactile feedback is incorporated. Tactile information can alleviate contact-placement uncertainty and facilitates contact modeling. We introduce a wrench-based classifier and evaluate it on a large set of real grasps. The key finding of this work is that exploiting tactile information allows wrench-based reasoning to perform on par with existing methods based on learning or simulation. Unlike those methods, the proposed approach needs no training data, requires little modeling effort, and is computationally efficient. Furthermore, it affords task generalization by considering the capabilities of the grasping device and the expected disturbance forces and moments in a physically meaningful way.
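To illustrate the kind of wrench-space reasoning the abstract refers to, the sketch below computes a Ferrari-Canny-style epsilon quality for a planar grasp: the radius of the largest origin-centred ball inside the convex hull of the contact wrenches, positive exactly when the grasp is force-closure. This is a generic textbook construction, not the paper's classifier; the function name `grasp_epsilon`, the restriction to planar point contacts with friction, and the example contact sets are all illustrative assumptions.

```python
import numpy as np
from scipy.spatial import ConvexHull

def grasp_epsilon(contacts, mu):
    """Planar grasp quality (illustrative sketch, not the paper's method).

    contacts: list of (position, inward_normal) 2-D pairs.
    mu: Coulomb friction coefficient.
    Returns the signed radius of the largest origin-centred ball inside
    the convex hull of the contact wrenches; > 0 iff force-closure.
    """
    wrenches = []
    for p, n in contacts:
        p, n = np.asarray(p, float), np.asarray(n, float)
        t = np.array([-n[1], n[0]])              # contact tangent direction
        for s in (+1.0, -1.0):                   # two friction-cone edges
            f = n + s * mu * t
            f /= np.linalg.norm(f)               # unit-magnitude contact force
            tau = p[0] * f[1] - p[1] * f[0]      # planar torque about origin
            wrenches.append([f[0], f[1], tau])   # wrench = (fx, fy, tau)
    hull = ConvexHull(np.array(wrenches))
    # Each hull facet satisfies a.x + b <= 0 with |a| = 1, so the signed
    # distance from the origin to the facet plane is -b; the minimum over
    # facets is the largest inscribed ball radius (negative if outside).
    return float(np.min(-hull.equations[:, -1]))

# Antipodal grasp of a unit disk: force-closure, epsilon > 0.
good = grasp_epsilon([((1, 0), (-1, 0)), ((-1, 0), (1, 0))], mu=0.5)
# Two contacts on adjacent sides with low friction: not force-closure.
bad = grasp_epsilon([((1, 0), (-1, 0)), ((0, 1), (0, -1))], mu=0.1)
```

A full 6-D version would use friction pyramids and a 6-D hull; the planar case keeps the geometry easy to check by hand while exercising the same origin-in-hull test.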

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2016. pp. 165-171.
Series
Proceedings - IEEE International Conference on Robotics and Automation, ISSN 1050-4729
Keyword [en]
Forecasting, Tools, Uncertainty analysis, Based reasonings, Computationally efficient, Contact modeling, Disturbance force, Robotic applications, Tactile feedback, Tactile information, Training data
National Category
Robotics
Identifiers
URN: urn:nbn:se:kth:diva-194529
DOI: 10.1109/ICRA.2016.7487130
ISI: 000389516200024
Scopus ID: 2-s2.0-84977515559
ISBN: 978-1-4673-8026-3 (print)
OAI: oai:DiVA.org:kth-194529
DiVA: diva2:1043814
Conference
2016 IEEE International Conference on Robotics and Automation (ICRA 2016), 16-21 May 2016
Funder
Knut and Alice Wallenberg Foundation
Note

QC 20161101

Available from: 2016-11-01. Created: 2016-10-31. Last updated: 2017-01-19. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text
Scopus

Search in DiVA

By author/editor
Kragic, Danica
By organisation
Computer Vision and Active Perception, CVAP
Robotics

Search outside of DiVA

Google
Google Scholar

Altmetric score

Total: 279 hits