Learning to Assess Grasp Stability from Vision, Touch and Proprioception
KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
2012 (English). Doctoral thesis, monograph (Other academic)
Abstract [en]

Grasping and manipulation of objects is an integral part of a robot's physical interaction with the environment. To cope with real-world situations, sensor-based grasping and grasp stability estimation are important skills. This thesis addresses the problem of predicting the stability of a grasp from the percepts available to a robot once its fingers have closed around the object, before it attempts to lift it. If an unstable grasp is identified, a regrasping step can be triggered. The percepts considered are object features (visual), gripper configurations (proprioceptive) and tactile imprints (haptic) when the fingers contact the object. The thesis studies tactile-based stability estimation using machine learning methods such as Hidden Markov Models, and introduces an approach based on Kernel Logistic Regression that integrates visual and tactile feedback to further improve grasp stability predictions.
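To illustrate the sequence-classification idea, the sketch below trains one class-conditional Gaussian HMM per stability label and classifies a new tactile sequence by comparing log-likelihoods. This is a minimal sketch, not the thesis code: the use of the hmmlearn library, the feature dimensionality, the number of hidden states and the synthetic data are all assumptions made for the example.

import numpy as np
from hmmlearn.hmm import GaussianHMM

def fit_class_hmm(sequences, n_states=3, seed=0):
    """Fit one Gaussian HMM to a list of tactile sequences (T_i x D arrays)."""
    X = np.concatenate(sequences)          # stack all frames: (sum T_i, D)
    lengths = [len(s) for s in sequences]  # per-sequence lengths for hmmlearn
    hmm = GaussianHMM(n_components=n_states, covariance_type="diag",
                      n_iter=50, random_state=seed)
    hmm.fit(X, lengths)
    return hmm

def predict_stability(seq, hmm_stable, hmm_unstable):
    """Label a new tactile sequence by comparing class log-likelihoods."""
    return "stable" if hmm_stable.score(seq) > hmm_unstable.score(seq) else "unstable"

# Synthetic stand-in data: 8-dimensional tactile features over 40 frames.
rng = np.random.default_rng(0)
stable_train = [rng.normal(0.0, 1.0, size=(40, 8)) for _ in range(20)]
unstable_train = [rng.normal(0.5, 1.5, size=(40, 8)) for _ in range(20)]

hmm_stable = fit_class_hmm(stable_train)
hmm_unstable = fit_class_hmm(unstable_train)
print(predict_stability(rng.normal(0.0, 1.0, size=(40, 8)),
                        hmm_stable, hmm_unstable))

One HMM per class keeps training simple and lets the log-likelihood ratio serve directly as a stability score; a fusion step such as the thesis's Kernel Logistic Regression could then combine this tactile evidence with visual and proprioceptive features.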

Like humans, robots are expected to grasp and manipulate objects in a goal-oriented manner. In other words, objects should be grasped so as to afford subsequent actions: if I am to hammer a nail, the hammer should be grasped so as to afford hammering. Most work on grasping addresses only the problem of finding a stable grasp, without considering the task or action the robot is supposed to perform with the object. This thesis therefore also studies grasp stability assessment in a task-oriented way, using a generative approach based on probabilistic graphical models, namely Bayesian Networks. High-level task information, introduced by a teacher in a supervised setting, is integrated with low-level stability requirements acquired through the robot's own exploration. The graphical model encodes probabilistic relationships between tasks and sensory data (visual, tactile and proprioceptive). The generative modeling approach enables inference of grasping configurations appropriate for a task, as well as prediction of grasp stability. Overall, the results indicate that learning-based grasp stability assessment is applicable in realistic scenarios.
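To make the generative formulation concrete, the following sketch builds a tiny discrete network, task -> grasp configuration -> stability, and infers which grasp configurations are probable given a task and the requirement that the grasp be stable. All variable names, cardinalities and probability tables here are invented for illustration; in the thesis such distributions are learned from teacher-supplied task labels and the robot's exploration.

import numpy as np

# Discrete variables: task T (0=hand-over, 1=pouring), grasp config G (3 values),
# stability S (0=unstable, 1=stable). Model factorization: P(T) P(G|T) P(S|G).
P_T = np.array([0.5, 0.5])                    # prior over tasks (assumed)
P_G_given_T = np.array([[0.6, 0.3, 0.1],      # P(G | T=hand-over)
                        [0.1, 0.3, 0.6]])     # P(G | T=pouring)
P_S_given_G = np.array([[0.2, 0.8],           # P(S | G=0)
                        [0.5, 0.5],           # P(S | G=1)
                        [0.3, 0.7]])          # P(S | G=2)

def joint(t, g, s):
    """Joint probability of one full assignment under the factorization."""
    return P_T[t] * P_G_given_T[t, g] * P_S_given_G[g, s]

def posterior_grasp(task, stable=1):
    """P(G | T=task, S=stable) by enumeration over the tiny joint table."""
    unnorm = np.array([joint(task, g, stable) for g in range(3)])
    return unnorm / unnorm.sum()

print("P(G | T=pouring, S=stable):", posterior_grasp(task=1))

Because the model is generative, the same joint table supports both directions of inference: conditioning on a task yields suitable grasp configurations, while conditioning on a grasp yields a stability prediction.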

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2012. vi, 99 p.
Series
Trita-CSC-A, ISSN 1653-5723 ; 2012:12
Keyword [en]
Robotic grasping, machine learning, tactile sensing
National Category
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-104035
ISBN: 978-91-7501-522-4 (print)
OAI: oai:DiVA.org:kth-104035
DiVA: diva2:562726
Public defence
2012-11-14, F3, Lindstedtsvägen 26, Kungliga Tekniska Högskolan, Stockholm, 10:00 (English)
Funder
ICT - The Next Generation
Note

QC 20121026

Available from: 2012-10-26. Created: 2012-10-25. Last updated: 2013-04-15. Bibliographically approved.

Open Access in DiVA

Yasemin_Bekiroglu_thesis (18946 kB), 1581 downloads
File information
File name: FULLTEXT01.pdf
File size: 18946 kB
Checksum (SHA-512): bd14678a59c8a8ad3eac54e06468b14ae30a2247d7ad10cdf1dd650200017959c8735b0f93d36855d49e89b558c2f3f86d32b92bb96288a82b7921af83407aca
Type: fulltext
Mimetype: application/pdf

