Grasping by parts: Robot grasp generation from 3D box primitives
2010 (English). In: 4th International Conference on Cognitive Systems, CogSys 2010, 2010. Conference paper, Published paper (Refereed)
Abstract [en]
Robot grasping capabilities are essential for perceiving, interpreting and acting in arbitrary and dynamic environments. While classical computer vision and visual scene interpretation focus rather passively on the robot's internal representation of the world, grasping capabilities are needed to actively execute tasks, modify scenarios and thereby reach versatile goals. Grasping is a central issue in various robot applications, especially when unknown objects have to be manipulated by the system. We present an approach to object description that is constrained by the actions the robot can perform. In particular, we connect box-like representations of objects with grasping, and motivate this approach in a number of ways. The contributions of our work are two-fold: in terms of shape approximation, we provide an algorithm for a 3D box primitive representation that identifies object parts from 3D point clouds. We motivate and evaluate this choice particularly toward the task of grasping. As a contribution in the field of grasping, we present a grasp hypothesis generation framework that utilizes the box representation in a highly flexible manner.
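To illustrate the general idea described in the abstract (not the authors' algorithm), the following minimal Python sketch fits an oriented box to a 3D point-cloud segment via PCA and then generates one grasp hypothesis per box face, approaching along the inward face normal. All function names and the hypothesis format are illustrative assumptions; the paper's actual decomposition and grasp generation are more elaborate.

# Minimal sketch, assuming an (N, 3) numpy array per object part.
import numpy as np

def fit_box(points):
    """Fit an oriented bounding box to a point-cloud segment."""
    center = points.mean(axis=0)
    # Principal axes of the segment give the box orientation.
    _, _, axes = np.linalg.svd(points - center, full_matrices=False)
    local = (points - center) @ axes.T  # points in box coordinates
    half_extents = (local.max(axis=0) - local.min(axis=0)) / 2.0
    box_center = center + ((local.max(axis=0) + local.min(axis=0)) / 2.0) @ axes
    return box_center, axes, half_extents

def grasp_hypotheses(box_center, axes, half_extents):
    """One hypothesis per face: a point on the face and an approach
    direction pointing back toward the box center."""
    hypotheses = []
    for i in range(3):
        for sign in (+1.0, -1.0):
            normal = sign * axes[i]
            point = box_center + normal * half_extents[i]
            hypotheses.append({"point": point, "approach": -normal})
    return hypotheses

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic box-shaped segment standing in for one object part.
    cloud = rng.uniform(-1.0, 1.0, size=(500, 3)) * np.array([0.3, 0.1, 0.05])
    c, R, h = fit_box(cloud)
    print(len(grasp_hypotheses(c, R, h)), "grasp hypotheses")

In this sketch the six face-centered hypotheses would still need to be filtered and ranked, e.g. by gripper width against the box extents and by reachability, which is the role a grasp hypothesis generation framework such as the one described plays.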
Place, publisher, year, edition, pages
2010.
Keywords [en]
3D point cloud, Dynamic environments, Hypothesis generation, Internal representation, Object description, Shape approximation, Unknown objects, Visual interpretation, Approximation algorithms, Robot applications, Robot learning, Three dimensional computer graphics, Cognitive systems
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:kth:diva-148888
Scopus ID: 2-s2.0-84878287698
OAI: oai:DiVA.org:kth-148888
DiVA, id: diva2:738325
Conference
4th International Conference on Cognitive Systems, CogSys 2010, 27 January 2010 through 28 January 2010, Zurich, Switzerland
Note
QC 20140818
Available from: 2014-08-18 Created: 2014-08-14 Last updated: 2022-06-23 Bibliographically approved