Learning Object Properties From Manipulation for Manipulation
2017 (English) Doctoral thesis, monograph (Other academic)
Abstract [en]
The world contains objects with various properties: rigid, granular, liquid, elastic, or plastic. As humans, while interacting with objects, we plan our manipulation by considering their properties. For instance, while holding a rigid object such as a brick, we adapt our grasp based on its centre of mass so as not to drop it. While manipulating a deformable object, on the other hand, we may consider properties in addition to the centre of mass, such as elasticity and brittleness, to maintain grasp stability. Knowing object properties is therefore an integral part of skilled manipulation of objects.
To manipulate objects skillfully, robots should be able to predict object properties as humans do. To predict the properties, interactions with objects are essential. These interactions give rise to distinct sensory signals that contain information about the object properties. Signals coming from a single sensory modality may be ambiguous or noisy. Hence, by integrating multiple sensory modalities (vision, touch, audio, or proprioception), a manipulated object can be observed from different perspectives, which can decrease the uncertainty in the observed properties. By analyzing the perceived sensory signals, a robot reasons about the object properties and adjusts its manipulation based on this information. During this adjustment, the robot can make use of a simulation model that predicts the object's behavior in order to plan the next action. For instance, if an object is assumed to be rigid before interaction but exhibits deformable behavior during interaction, an internal simulation model can be used to predict the load force exerted on the object, so that appropriate manipulation can be planned for the next action. Learning about object properties can thus be defined as an active procedure: the robot explores the object properties actively and purposefully by interacting with the object, adjusting its manipulation based on the sensory information and on the object behavior predicted through an internal simulation model.
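The claim that integrating modalities decreases uncertainty can be made concrete with a standard Gaussian fusion step, where the fused variance is never larger than either input variance. The following is a minimal sketch, not code from the thesis; the stiffness values and noise levels are invented for illustration.

```python
import numpy as np

def fuse_gaussian(mu_a, var_a, mu_b, var_b):
    """Fuse two independent Gaussian estimates of the same property.

    The fused variance is smaller than either input variance, which is
    the sense in which integrating modalities reduces uncertainty.
    """
    var_f = 1.0 / (1.0 / var_a + 1.0 / var_b)
    mu_f = var_f * (mu_a / var_a + mu_b / var_b)
    return mu_f, var_f

# Hypothetical readings of one stiffness value (N/m) from two modalities.
mu_vision, var_vision = 120.0, 25.0   # vision: available at a distance, coarser
mu_touch,  var_touch  = 110.0,  9.0   # touch: direct contact, less noisy

mu, var = fuse_gaussian(mu_vision, var_vision, mu_touch, var_touch)
print(f"fused stiffness: {mu:.1f} N/m, variance {var:.2f}")
# The fused variance (~6.6) is below that of the best single modality (9.0).
```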
This thesis investigates the mechanisms mentioned above that are necessary for learning object properties: (i) multi-sensory information, (ii) simulation, and (iii) active exploration. In particular, we investigate these three mechanisms as different and complementary ways of extracting a certain object property: the deformability of objects. Firstly, we investigate the feasibility of using visual and/or tactile data to classify the content of a container based on the deformation observed when a robotic hand squeezes and deforms the container. According to our results, visual and tactile sensory data each individually yield high accuracy rates when classifying the content type based on the deformation. Next, we investigate the use of a simulation model to estimate the deformability of an object as revealed through manipulation. The proposed method accurately identifies the deformability of the test objects on both synthetic and real-world data. Finally, we investigate the integration of the deformation simulation into a robotic active perception framework to extract the heterogeneous deformability properties of an environment through physical interactions. In experiments on real-world objects, we show that the active perception framework can map the heterogeneous deformability properties of a surface.
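As a rough illustration of the second mechanism, estimating deformability by matching a simulation to an observed deformation, the sketch below fits a single stiffness parameter of a toy linear-elastic (Hookean) forward model to noisy force-displacement observations by grid search. This is not the method used in the thesis; the forward model, names, and values are illustrative assumptions.

```python
import numpy as np

def simulate_indentation(stiffness, forces):
    """Toy forward model: displacement of a linear-elastic surface
    under a set of probe forces (Hooke's law, x = F / k)."""
    return forces / stiffness

def estimate_stiffness(forces, observed_disp, candidates):
    """Pick the stiffness whose simulated deformation best matches the
    observed one (least-squares grid search over candidate values)."""
    errors = [np.sum((simulate_indentation(k, forces) - observed_disp) ** 2)
              for k in candidates]
    return candidates[int(np.argmin(errors))]

# Hypothetical probe data: forces the robot applied and the resulting
# deformations it observed (e.g. tracked visually), with sensor noise.
rng = np.random.default_rng(0)
true_k = 150.0                           # N/m, unknown to the robot
forces = np.linspace(1.0, 10.0, 10)     # N
observed = forces / true_k + rng.normal(0.0, 1e-3, forces.shape)

candidates = np.linspace(50.0, 300.0, 251)
print(f"estimated stiffness: {estimate_stiffness(forces, observed, candidates):.1f} N/m")
```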
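Similarly, the active perception idea of probing where the current estimate is most uncertain can be sketched as a greedy loop over a small stiffness grid. Again, this is a hedged illustration rather than the framework from the thesis; the grid size, noise model, and per-cell Gaussian update are assumptions.

```python
import numpy as np

def explore_surface(true_stiffness_map, probes=9, noise_std=5.0, prior_var=1e4):
    """Greedy active exploration sketch: repeatedly poke the grid cell
    with the highest posterior variance and fuse the noisy stiffness
    measurement into that cell's Gaussian estimate."""
    mean = np.zeros_like(true_stiffness_map)
    var = np.full_like(true_stiffness_map, prior_var)
    rng = np.random.default_rng(1)
    for _ in range(probes):
        cell = np.unravel_index(np.argmax(var), var.shape)        # most uncertain cell
        z = true_stiffness_map[cell] + rng.normal(0.0, noise_std)  # noisy poke
        gain = var[cell] / (var[cell] + noise_std ** 2)            # scalar Kalman gain
        mean[cell] += gain * (z - mean[cell])
        var[cell] *= 1.0 - gain
    return mean, var

# Hypothetical 3x3 surface (stiffness in N/m) with a soft patch in the centre.
surface = np.array([[200.0, 200.0, 200.0],
                    [200.0,  80.0, 200.0],
                    [200.0, 200.0, 200.0]])
mean, var = explore_surface(surface)
print(np.round(mean, 1))   # recovered heterogeneous stiffness map after one pass
```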
Place, publisher, year, edition, pages
US-AB: KTH Royal Institute of Technology, 2017. p. 109
Series
TRITA-CSC-A, ISSN 1653-5723 ; 2017:16
Keywords [en]
robotics, manipulation, object properties, simulation, machine learning, object tracking, active exploration, sensory integration
National Category
Robotics
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-207154
ISBN: 978-91-7729-411-5 (print)
OAI: oai:DiVA.org:kth-207154
DiVA, id: diva2:1096344
Public defence
2017-06-13, F3, Lindstedtsvägen 26, Stockholm, 09:00 (English)
Funder
EU, FP7, Seventh Framework Programme, ICT-288533
Note
QC 20170517
Available from: 2017-05-17 Created: 2017-05-17 Last updated: 2022-06-27 Bibliographically approved