2013 (English). In: 2013 IEEE International Conference on Robotics and Automation (ICRA), IEEE conference proceedings, 2013, p. 1282-1289. Conference paper, Published paper (Refereed)
Abstract [en]
The ability to learn from human demonstration is essential for robots in human environments. The activity models that the robot builds from observation must take both the human motion and the objects involved into account. Object models designed for this purpose should reflect the role of the object in the activity - its function, or affordances. The main contribution of this paper is to represent objects directly in terms of their interaction with human hands, rather than in terms of appearance. This enables the direct representation of object affordances/function, while being robust to intra-class differences in appearance. Object hypotheses are first extracted from a video sequence as tracks of associated image segments. The object hypotheses are encoded as strings, where the vocabulary corresponds to different types of interaction with human hands. The similarity between two such object descriptors can be measured using a string kernel. Experiments show that these functional descriptors capture differences and similarities in object affordances/function that are not represented by appearance.
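The abstract describes encoding each object hypothesis as a string over a vocabulary of hand-interaction types and comparing descriptors with a string kernel. As an illustration only (the paper does not specify this particular kernel or vocabulary), the sketch below uses a simple p-spectrum string kernel, which counts shared length-p substrings, with a hypothetical three-symbol vocabulary: 'i' = idle, 'g' = grasped, 'm' = moved by hand.

```python
from collections import Counter
import math

def spectrum_kernel(s, t, p=2):
    """p-spectrum string kernel: counts co-occurring length-p substrings.

    s, t: interaction strings over a hand-interaction vocabulary
    (here hypothetically 'i' = idle, 'g' = grasped, 'm' = moved).
    """
    cs = Counter(s[i:i + p] for i in range(len(s) - p + 1))
    ct = Counter(t[i:i + p] for i in range(len(t) - p + 1))
    return sum(cs[sub] * ct[sub] for sub in cs)

def similarity(s, t, p=2):
    """Cosine-normalized kernel value in [0, 1]."""
    denom = math.sqrt(spectrum_kernel(s, s, p) * spectrum_kernel(t, t, p))
    return spectrum_kernel(s, t, p) / denom if denom else 0.0

# Two objects used the same way score higher than two used differently:
cup_a = "igmi"      # idle, grasped, moved, idle
cup_b = "iggmi"     # similar usage pattern, slightly longer grasp
hammer = "gmgmgm"   # repeated grasp-move cycles
print(similarity(cup_a, cup_b))   # higher
print(similarity(cup_a, hammer))  # lower
```

Note that the normalization makes the score invariant to string length, so objects observed for different durations remain comparable; more elaborate kernels (e.g. gap-weighted subsequence kernels) would tolerate insertions in the interaction sequence.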
Place, publisher, year, edition, pages
IEEE conference proceedings, 2013
Series
IEEE International Conference on Robotics and Automation, ISSN 1050-4729
Keywords
Activity models, Functional object, Human activities, Human demonstrations, Human environment, Image segments, Object descriptors, Video sequences
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:kth:diva-138526 (URN)10.1109/ICRA.2013.6630736 (DOI)000337617301042 ()2-s2.0-84887281861 (Scopus ID)978-1-4673-5641-1 (ISBN)
Conference
2013 IEEE International Conference on Robotics and Automation, ICRA 2013; Karlsruhe; Germany; 6 May 2013 through 10 May 2013
Note
QC 20140107
Available from: 2013-12-19. Created: 2013-12-19. Last updated: 2025-02-07. Bibliographically approved.