Human-to-Robot Mapping of Grasps
2008 (English) Conference paper (Refereed)
We are developing a Programming by Demonstration (PbD) system in which recognition of objects and pick-and-place actions represents a basic building block for task learning. An important capability in this system is automatic visual recognition of human grasps, together with methods for mapping the human grasps to the functionally corresponding robot grasps. This paper describes the grasp recognition system, focusing on the human-to-robot mapping. The visual grasp classification and grasp orientation regression are described in our IROS 2008 paper. In contrast to earlier approaches, no articulated 3D reconstruction of the hand over time takes place. The input data consist of a single image of the human hand. The hand shape is classified as one of six grasps by finding similar hand shapes in a large database of grasp images. From the database, the hand orientation is also estimated. The recognized grasp is then mapped to one of three predefined Barrett hand grasps. Depending on the type of robot grasp, a precomputed grasp strategy is selected. The strategy is further parameterized by the orientation of the hand relative to the environment.
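The pipeline in the abstract (nearest-neighbor grasp classification from a single image, orientation estimation from the same database, then a many-to-one mapping onto Barrett hand grasps) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature representation, the grasp class names, and the `HUMAN_TO_ROBOT` mapping below are assumed placeholders, and the orientation "regression" is reduced to a mean over the nearest neighbors.

```python
import numpy as np

# Hypothetical labels for the six human grasp classes and the three
# Barrett hand grasps; the actual taxonomy is defined in the paper.
HUMAN_GRASPS = ["power", "cylindrical", "spherical",
                "lateral", "precision_disc", "precision_tripod"]
BARRETT_GRASPS = ["wrap", "two_finger_thumb", "precision"]

# Illustrative many-to-one human-to-robot grasp mapping (assumed).
HUMAN_TO_ROBOT = {
    "power": "wrap", "cylindrical": "wrap", "spherical": "wrap",
    "lateral": "two_finger_thumb",
    "precision_disc": "precision", "precision_tripod": "precision",
}

def classify_grasp(query, db_features, db_labels, db_orientations, k=5):
    """Nearest-neighbor lookup in a database of grasp-image features.

    Returns the majority grasp label and the mean hand orientation
    (e.g. Euler angles) over the k closest database entries.
    """
    dists = np.linalg.norm(db_features - query, axis=1)
    idx = np.argsort(dists)[:k]
    labels = [db_labels[i] for i in idx]
    grasp = max(set(labels), key=labels.count)        # majority vote
    orientation = db_orientations[idx].mean(axis=0)   # crude regression
    return grasp, orientation

def map_to_robot(human_grasp):
    """Map a recognized human grasp to a predefined robot grasp."""
    return HUMAN_TO_ROBOT[human_grasp]
```

A recognized grasp then selects a precomputed strategy for that robot grasp type, parameterized by the estimated hand orientation.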
Computer and Information Science
Identifiers
URN: urn:nbn:se:kth:diva-66499
OAI: oai:DiVA.org:kth-66499
DiVA: diva2:484233
2008 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Nice, France, September 22-26, 2008
QC 20120127. Invited paper in the Grasp and Task Learning by Imitation workshop. Bibliographically approved.