Learning Tactile Characterizations Of Object- And Pose-specific Grasps
2011 (English). Conference paper (Refereed)
Our aim is to predict the stability of a grasp from the percepts available to a robot before it attempts to lift and transport an object. The percepts we consider consist of the tactile imprints and the object-gripper configuration recorded before and while the robot's manipulator closes around the object. Our robot is equipped with multiple tactile sensing arrays and is able to track the pose of an object during the application of a grasp. We present a kernel-logistic-regression model of pose- and touch-conditional grasp success probability, which we train on grasp data collected by letting the robot experience the effect of teacher-suggested grasps on its tactile and visual signals, and by letting it verify which grasps can be used to rigidly control the object. We consider models defined on several subspaces of our input data, e.g., using tactile perceptions or pose information only. Our experiment demonstrates that joint tactile and pose-based percepts carry valuable grasp-related information: models trained on both hand poses and tactile parameters outperform models trained on either perceptual input alone.
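The model described above conditions grasp success probability on pose and tactile features via kernel logistic regression. The sketch below is not the authors' implementation; it is a minimal illustration of the technique under stated assumptions: an RBF kernel, gradient-descent training, and made-up pose/tactile feature dimensions with a toy stability rule standing in for the teacher-demonstrated grasp data.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelLogisticRegression:
    """Minimal kernel logistic regression trained by gradient descent:
    P(success | x) = sigmoid(sum_i alpha_i k(x_i, x) + b)."""

    def __init__(self, gamma=0.5, reg=1e-3, lr=0.5, n_iter=1000):
        self.gamma, self.reg, self.lr, self.n_iter = gamma, reg, lr, n_iter

    def fit(self, X, y):
        self.X_ = X
        K = rbf_kernel(X, X, self.gamma)
        n = len(y)
        self.alpha_ = np.zeros(n)
        self.b_ = 0.0
        for _ in range(self.n_iter):
            p = 1.0 / (1.0 + np.exp(-(K @ self.alpha_ + self.b_)))
            err = p - y  # gradient of the cross-entropy loss w.r.t. the scores
            self.alpha_ -= self.lr * (K @ err / n + self.reg * (K @ self.alpha_))
            self.b_ -= self.lr * err.mean()
        return self

    def predict_proba(self, X):
        K = rbf_kernel(X, self.X_, self.gamma)
        return 1.0 / (1.0 + np.exp(-(K @ self.alpha_ + self.b_)))

# Toy "grasp data": hypothetical pose and tactile features with a made-up
# stability rule, standing in for the teacher-suggested grasps in the paper.
rng = np.random.default_rng(0)
pose = rng.normal(size=(80, 3))      # hypothetical hand-pose features
tactile = rng.normal(size=(80, 4))   # hypothetical tactile-imprint features
X = np.hstack([pose, tactile])       # joint pose + tactile input space
y = (pose[:, 0] + tactile[:, 0] > 0).astype(float)  # toy success labels

model = KernelLogisticRegression().fit(X, y)
probs = model.predict_proba(X)       # predicted grasp success probabilities
```

Restricting `X` to only the `pose` or only the `tactile` columns reproduces, in spirit, the single-modality baseline models the abstract compares against.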
Place, publisher, year, edition, pages
IEEE conference proceedings, 2011, pp. 1554-1560
Series: IEEE International Conference on Intelligent Robots and Systems, ISSN 2153-0858
Keywords: Grasping, Kernel Logistic Regression, Tactile Sensing
Subject categories: Computer Science, Robotics
Identifiers
URN: urn:nbn:se:kth:diva-38280
DOI: 10.1109/IROS.2011.6094878
ISI: 000297477501137
Scopus ID: 2-s2.0-84455203892
ISBN: 978-1-61284-454-1
OAI: oai:DiVA.org:kth-38280
DiVA: diva2:436513
Conference: IEEE/RSJ International Conference on Intelligent Robots and Systems
Projects: EU FP7 project CogX
Funder: ICT - The Next Generation
Bibliographically approved