Enhancing geometric maps through environmental interactions
KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning (RPL). ORCID iD: 0000-0002-6716-1111
2018 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The deployment of rescue robots in real operations is becoming increasingly common thanks to recent advances in AI technologies and high-performance hardware. Rescue robots can now operate for extended periods of time, cover wider areas and process larger amounts of sensory information, making them considerably more useful during real life-threatening situations, including both natural and man-made disasters.

In this thesis we present results of our research which focuses on investigating ways of enhancing visual perception for Unmanned Ground Vehicles (UGVs) through environmental interactions using different sensory systems, such as tactile sensors and wireless receivers.

We argue that a geometric representation of the robot's surroundings built upon vision data alone may not suffice in challenging scenarios, and show that robot interactions with the environment can provide a rich layer of new information that needs to be suitably represented and merged into the cognitive world model. Visual perception for mobile ground vehicles is one of the fundamental problems in rescue robotics. Phenomena such as rain, fog, darkness, dust, smoke and fire heavily degrade the performance of visual sensors and often result in highly noisy data, leading to unreliable or incomplete maps.

We address this problem through a collection of studies and structure the thesis as follows.

Firstly, we give an overview of the Search & Rescue (SAR) robotics field, and discuss scenarios, hardware and related scientific questions.

Secondly, we focus on the problems of control and communication. Mobile robots require stable communication with the base station to exchange valuable information. Communication loss often presents a significant mission risk, and disconnected robots are either abandoned or autonomously try to back-trace their way to the base station. We show how non-visual environmental properties (e.g. the WiFi signal distribution) can be efficiently modeled using probabilistic active perception frameworks based on Gaussian Processes, and merged into geometric maps so as to facilitate the SAR mission. We then show how to use tactile perception to enhance mapping: implicit environmental properties, such as terrain deformability, are analyzed through strategic glances and touches and then mapped into probabilistic models.

Lastly, we address the problem of reconstructing objects in the environment. We present a technique for simultaneous 3D reconstruction of static regions and rigidly moving objects in a scene that enables on-the-fly model generation.

Although this thesis focuses mostly on rescue UGVs, the concepts presented can be applied to other mobile platforms that operate under similar circumstances. To make sure that the suggested methods work, we have put effort into the design of user interfaces and their evaluation in user studies.
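The core idea above — modelling a non-visual quantity such as the WiFi signal distribution as a smooth random field over robot positions and merging it into a geometric map — can be sketched with a plain Gaussian Process regressor. This is a minimal stdlib-only illustration with a squared-exponential kernel, not the thesis framework; positions, readings and hyperparameters are invented for the example.

```python
import math

def rbf(a, b, length=2.0):
    """Squared-exponential kernel on 2-D positions."""
    d2 = (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return math.exp(-d2 / (2 * length ** 2))

def solve(A, y):
    """Solve A x = y by Gauss-Jordan elimination (small systems only)."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
    return [M[i][n] / M[i][i] for i in range(n)]

def gp_predict(train_x, train_y, query, noise=1e-3):
    """GP posterior mean and variance at a query position."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(train_x)] for i, a in enumerate(train_x)]
    alpha = solve(K, train_y)                      # K^-1 y
    k_star = [rbf(x, query) for x in train_x]
    mean = sum(w * k for w, k in zip(alpha, k_star))
    v = solve(K, k_star)                           # K^-1 k*
    var = rbf(query, query) - sum(k * vi for k, vi in zip(k_star, v))
    return mean, var

# RSS samples (dBm) collected along a short UGV trajectory.
positions = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
rss = [-40.0, -45.0, -52.0, -60.0]
mean, var = gp_predict(positions, rss, (1.5, 0.0))  # estimate RSS between samples
```

The predictive variance is what makes the representation useful for active perception: cells of the geometric map where the variance is high are the ones worth measuring next.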

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2018, p. 58
Series
TRITA-EECS-AVL ; 2018:26
Keywords [en]
Gaussian processes, robotics, UGV, active perception, geometric maps
National Category
Engineering and Technology
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-225957
ISBN: 978-91-7729-720-8 (print)
OAI: oai:DiVA.org:kth-225957
DiVA, id: diva2:1196889
Public defence
2018-04-18, F3, Lindstedtsvägen 26, Sing-Sing, floor 2, KTH Campus, Stockholm, 10:00 (English)
Opponent
Supervisors
Funder
EU, FP7, Seventh Framework Programme
Note

QC 20180411

Available from: 2018-04-11 Created: 2018-04-11 Last updated: 2018-04-11. Bibliographically approved
List of papers
1. Free Look UGV Teleoperation Control Tested in Game Environment: Enhanced Performance and Reduced Workload
2016 (English) In: International Symposium on Safety, Security and Rescue Robotics, 2016. Conference paper, Published paper (Refereed)
Abstract [en]

Concurrent telecontrol of the chassis and camera of an Unmanned Ground Vehicle (UGV) is a demanding task for Urban Search and Rescue (USAR) teams. The standard way of controlling UGVs is called Tank Control (TC), but there is reason to believe that Free Look Control (FLC), a control mode used in games, could reduce this load substantially by decoupling, and providing separate controls for, camera translation and rotation. The general hypothesis is that FLC (1) reduces robot operators' workload and (2) enhances their performance for dynamic and time-critical USAR scenarios. A game-based environment was set up to systematically compare FLC with TC in two typical search and rescue tasks: navigation and exploration. The results show that FLC improves mission performance in both exploration (search) and path following (navigation) scenarios. In the former, more objects were found, and in the latter shorter navigation times were achieved. FLC also caused lower workload and stress levels in both scenarios, without inducing a significant difference in the number of collisions. Finally, FLC was preferred by 75% of the subjects for exploration, and 56% for path following.

Keywords
Teleoperation, UGV, Search and Rescue, First Response, Disaster Response, FPS, Computer Game
National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-192941 (URN)
10.1109/SSRR.2016.7784321 (DOI)
000391310800053 ()
2-s2.0-85009804146 (Scopus ID)
Conference
International Symposium on Safety, Security and Rescue Robotics, Lausanne, October 23-27, 2016
Projects
TRADR
Funder
EU, FP7, Seventh Framework Programme, FP7-ICT-609763
Note

QC 20161212

Available from: 2016-09-26 Created: 2016-09-23 Last updated: 2018-04-11. Bibliographically approved
2. Extending a UGV Teleoperation FLC Interface with Wireless Network Connectivity Information
2015 (English) In: 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, 2015, p. 4305-4312. Conference paper, Published paper (Refereed)
Abstract [en]

Teleoperated Unmanned Ground Vehicles (UGVs) are expected to play an important role in future search and rescue operations. In such tasks, two factors are crucial for a successful mission completion: operator situational awareness and robust network connectivity between operator and UGV. In this paper, we address both these factors by extending a new Free Look Control (FLC) operator interface with a graphical representation of the Radio Signal Strength (RSS) gradient at the UGV location. We also provide a new way of estimating this gradient using multiple receivers with directional antennas. The proposed approach allows the operator to stay focused on the video stream providing the crucial situational awareness, while controlling the UGV to complete the mission without moving into areas with dangerously low wireless connectivity. The approach is implemented on a KUKA youBot using commercial-off-the-shelf components. We provide experimental results showing how the proposed RSS gradient estimation method performs better than a difference approximation using omnidirectional antennas and verify that it is indeed useful for predicting the RSS development along a UGV trajectory. We also evaluate the proposed combined approach in terms of accuracy, precision, sensitivity and specificity.
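The gradient-estimation idea above — combining several directional-antenna readings into one RSS gradient at the UGV — can be illustrated with a least-squares fit. This sketch assumes a simplified linear measurement model (each antenna's reading is a baseline plus the gradient projected onto its heading); the paper's actual estimator and antenna geometry may differ, and all numbers here are invented.

```python
import math

def solve3(A, y):
    """Gauss-Jordan solve for a 3x3 linear system."""
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(3):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                for k in range(c, 4):
                    M[r][k] -= f * M[c][k]
    return [M[i][3] / M[i][i] for i in range(3)]

def estimate_rss_gradient(headings_deg, readings):
    """Least-squares fit of reading_i = base + g . u_i, where u_i is the
    unit heading vector of directional antenna i. Returns (base, gx, gy).
    Illustrative simplification, not the paper's exact estimator."""
    rows = [[1.0, math.cos(math.radians(h)), math.sin(math.radians(h))]
            for h in headings_deg]
    # Normal equations: (A^T A) p = A^T y
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    Aty = [sum(r[i] * y for r, y in zip(rows, readings)) for i in range(3)]
    return solve3(AtA, Aty)

# Four directional antennas at 90-degree spacing; readings in dBm.
base, gx, gy = estimate_rss_gradient([0, 90, 180, 270],
                                     [-48.0, -51.0, -52.0, -49.0])
```

With four or more headings the system is overdetermined, which is what lets the directional approach outperform a plain two-point difference approximation in noisy conditions.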

Place, publisher, year, edition, pages
IEEE, 2015
Series
IEEE International Conference on Intelligent Robots and Systems, ISSN 2153-0858
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:kth:diva-185108 (URN)
10.1109/IROS.2015.7353987 (DOI)
000371885404073 ()
2-s2.0-84958172076 (Scopus ID)
978-1-4799-9994-1 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), SEP 28-OCT 02, 2015, Hamburg, GERMANY
Note

QC 20160412

Available from: 2016-04-12 Created: 2016-04-11 Last updated: 2018-04-11. Bibliographically approved
3. A New UGV Teleoperation Interface for Improved Awareness of Network Connectivity and Physical Surroundings
Open this publication in new window or tab >>A New UGV Teleoperation Interface for Improved Awareness of Network Connectivity and Physical Surroundings
Show others...
2017 (English) In: Journal of Human-Robot Interaction, E-ISSN 2163-0364, Vol. 6, no 3, p. 48-70. Article in journal (Refereed) Published
Abstract [en]

A reliable wireless connection between the operator and the teleoperated unmanned ground vehicle (UGV) is critical in many urban search and rescue (USAR) missions. Unfortunately, as was seen in, for example, the Fukushima nuclear disaster, the networks available in areas where USAR missions take place are often severely limited in range and coverage. Therefore, during mission execution, the operator needs to keep track of not only the physical parts of the mission, such as navigating through an area or searching for victims, but also the variations in network connectivity across the environment. In this paper, we propose and evaluate a new teleoperation user interface (UI) that includes a way of estimating the direction of arrival (DoA) of the radio signal strength (RSS) and integrating the DoA information in the interface. The evaluation shows that using the interface results in more objects found, and fewer aborted missions due to connectivity problems, as compared to a standard interface. The proposed interface is an extension of an existing interface centered on the video stream captured by the UGV. But instead of just showing the network signal strength in terms of percent and a set of bars, the additional information of DoA is added in terms of a color bar surrounding the video feed. With this information, the operator knows what movement directions are safe, even when moving in regions close to the connectivity threshold.
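The color-bar idea above boils down to a mapping from the angle between a border segment's movement direction and the estimated DoA to a color. The following is a hypothetical illustration of such a mapping (the function name and the green-to-red ramp are assumptions; the paper's actual color scheme may differ):

```python
def doa_edge_color(doa_deg, segment_deg):
    """RGB color for one border segment of the video feed: green when the
    segment's direction is aligned with the radio DoA (moving toward good
    signal), red when opposed. Hypothetical mapping, for illustration."""
    # Smallest absolute angle between the two directions, in 0..180.
    diff = abs((doa_deg - segment_deg + 180.0) % 360.0 - 180.0)
    t = diff / 180.0
    return (int(255 * t), int(255 * (1.0 - t)), 0)  # (R, G, B)
```

Painting every segment of the video-feed border this way gives the operator a peripheral cue about safe movement directions without looking away from the stream.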

Place, publisher, year, edition, pages
Journal of Human-Robot Interaction, 2017
Keywords
teleoperation, UGV, search and rescue, FLC, network connectivity, user interface
National Category
Robotics
Identifiers
urn:nbn:se:kth:diva-223539 (URN)
10.5898/JHRI.6.3.Parasuraman (DOI)
000424170700004 ()
Funder
EU, FP7, Seventh Framework Programme, FP7-ICT-609763 TRADR
Note

QC 20180222

Available from: 2018-02-22 Created: 2018-02-22 Last updated: 2018-04-11. Bibliographically approved
4. RCAMP: A Resilient Communication-Aware Motion Planner for Mobile Robots with Autonomous Repair of Wireless Connectivity
2017 (English) In: 2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS) / [ed] Bicchi, A.; Okamura, A., IEEE, 2017, p. 2010-2017. Conference paper, Published paper (Refereed)
Abstract [en]

Mobile robots, be they autonomous or teleoperated, require stable communication with the base station to exchange valuable information. Given the stochastic elements in radio signal propagation, such as shadowing and fading, and the possibilities of unpredictable events or hardware failures, communication loss often presents a significant mission risk, both in terms of probability and impact, especially in Urban Search and Rescue (USAR) operations. Depending on the circumstances, disconnected robots are either abandoned, or attempt to autonomously back-trace their way to the base station. Although recent results in Communication-Aware Motion Planning can be used to effectively manage connectivity with robots, there are no results focusing on autonomously re-establishing the wireless connectivity of a mobile robot without back-tracing or using detailed a priori information of the network. In this paper, we present a robust and online radio signal mapping method using Gaussian Random Fields, and propose a Resilient Communication-Aware Motion Planner (RCAMP) that integrates the above signal mapping framework with a motion planner. RCAMP considers both the environment and the physical constraints of the robot, based on the available sensory information. We also propose a self-repair strategy using RCAMP that takes both connectivity and the goal position into account when driving to a connection-safe position in the event of a communication loss. We demonstrate the proposed planner in a set of realistic simulations of an exploration task in single- or multi-channel communication scenarios.
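The self-repair strategy described above needs to score candidate waypoints by trading off predicted connectivity against progress toward the task goal. The sketch below shows one way such a trade-off could look; the scoring weights, clamping thresholds and toy signal model are all assumptions for illustration, not RCAMP's actual cost function.

```python
import math

def select_recovery_waypoint(candidates, goal, rss_at,
                             rss_min=-90.0, rss_safe=-75.0, w=0.5):
    """Pick the waypoint that best balances predicted connectivity and
    progress toward the goal, in the spirit of RCAMP's self-repair step.
    rss_at is any callable returning predicted RSS (dBm) at a position;
    weights and thresholds here are illustrative, not from the paper."""
    def score(p):
        conn = (rss_at(p) - rss_min) / (rss_safe - rss_min)
        conn = min(1.0, max(0.0, conn))              # clamp to [0, 1]
        progress = 1.0 / (1.0 + math.hypot(p[0] - goal[0], p[1] - goal[1]))
        return w * conn + (1.0 - w) * progress
    return max(candidates, key=score)

# Toy signal model: RSS falls off with distance from a base station at (0, 0).
rss_model = lambda p: -40.0 - 8.0 * math.hypot(p[0], p[1])
best = select_recovery_waypoint([(1, 0), (4, 0), (0, 3)],
                                goal=(5, 0), rss_at=rss_model)
```

In practice `rss_at` would be the posterior mean of the Gaussian Random Field signal map, so the same model serves both normal planning and connectivity recovery.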

Place, publisher, year, edition, pages
IEEE, 2017
Series
IEEE International Conference on Intelligent Robots and Systems, ISSN 2153-0858
Keywords
Mobile Robots, Self-Repair, Wireless Communication, Communication-Aware Motion Planning
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:kth:diva-225803 (URN)
10.1109/IROS.2017.8206020 (DOI)
000426978202045 ()
2-s2.0-85041962473 (Scopus ID)
978-1-5386-2682-5 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), SEP 24-28, 2017, Vancouver, CANADA
Note

QC 20180409

Available from: 2018-04-09 Created: 2018-04-09 Last updated: 2019-04-09. Bibliographically approved
5. Active Exploration Using Gaussian Random Fields and Gaussian Process Implicit Surfaces
2016 (English) In: 2016 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2016), Institute of Electrical and Electronics Engineers (IEEE), 2016, p. 582-589. Conference paper, Published paper (Refereed)
Abstract [en]

In this work we study the problem of exploring surfaces and building compact 3D representations of the environment surrounding a robot through active perception. We propose an online probabilistic framework that merges visual and tactile measurements using Gaussian Random Fields and Gaussian Process Implicit Surfaces. The system investigates incomplete point clouds in order to find a small set of regions of interest which are then physically explored with a robotic arm equipped with tactile sensors. We show experimental results obtained using a PrimeSense camera, a Kinova Jaco2 robotic arm and Optoforce sensors on different scenarios. We then demonstrate how to use the online framework for object detection and terrain classification.
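The selection of regions of interest described above can be caricatured in a few lines: touch next wherever the model is least certain. In the paper that uncertainty is the Gaussian Process predictive variance; the sketch below substitutes a cheap nearest-neighbour distance as a stand-in for it, so it is an illustration of the active-exploration loop, not of the GP machinery itself.

```python
import math

def next_touch(observed, candidates):
    """Choose the next region to touch: the candidate farthest from any
    existing observation. Nearest-neighbour distance is used here as a
    cheap stand-in for the GP predictive variance used in the paper."""
    return max(candidates,
               key=lambda p: min(math.dist(p, q) for q in observed))

# Points already seen by the depth camera vs. holes in the point cloud.
seen = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
holes = [(0.5, 0.5), (3.0, 2.0)]
target = next_touch(seen, holes)
```

Each tactile measurement is then fed back into the surface model, shrinking the uncertainty around the touched region before the next candidate is chosen.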

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2016
Keywords
Active perception, Surface reconstruction, Gaussian process, Implicit surface, Random field, Tactile exploration
National Category
Robotics
Identifiers
urn:nbn:se:kth:diva-202672 (URN)
10.1109/IROS.2016.7759112 (DOI)
000391921700086 ()
2-s2.0-85006371409 (Scopus ID)
978-1-5090-3762-9 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), OCT 09-14, 2016, Daejeon, SOUTH KOREA
Note

QC 20170306

Available from: 2017-03-06 Created: 2017-03-06 Last updated: 2018-04-11. Bibliographically approved
6. Active perception and modeling of deformable surfaces using Gaussian processes and position-based dynamics
2016 (English) In: IEEE-RAS International Conference on Humanoid Robots, IEEE, 2016, p. 530-537. Conference paper, Published paper (Refereed)
Abstract [en]

Exploring and modeling heterogeneous elastic surfaces requires multiple interactions with the environment and a complex selection of physical material parameters. The most common approaches model deformable properties from sets of offline observations using computationally expensive force-based simulators. In this work we present an online probabilistic framework for autonomous estimation of a deformability distribution map of heterogeneous elastic surfaces from a few physical interactions. The method takes advantage of Gaussian Processes for constructing a model of the environment geometry surrounding a robot. A fast Position-based Dynamics simulator uses focused environmental observations in order to model the elastic behavior of portions of the environment. Gaussian Process Regression maps the local deformability onto the whole environment in order to generate a deformability distribution map. We show experimental results using a PrimeSense camera, a Kinova Jaco2 robotic arm and an Optoforce sensor on different deformable surfaces.
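The final mapping step described above — spreading a handful of local deformability estimates over the whole surface — can be illustrated with a simple interpolator. Each touch yields displacement divided by applied force as a local deformability; the sketch below uses inverse-distance weighting as a stand-in for the paper's Gaussian Process Regression, and all touch data are invented.

```python
import math

def deformability_at(touches, query):
    """touches: list of ((x, y), force_N, displacement_m) from physical
    probes. Returns an inverse-distance-weighted deformability (m/N) at
    the query position — a simple stand-in for the Gaussian Process
    Regression step that builds the deformability distribution map."""
    num = den = 0.0
    for pos, force_n, disp_m in touches:
        w = 1.0 / (math.dist(pos, query) ** 2 + 1e-6)  # avoid div by zero
        num += w * (disp_m / force_n)                  # local compliance
        den += w
    return num / den

# Two strategic touches: a stiff patch at (0, 0), a softer one at (2, 0).
touches = [((0.0, 0.0), 10.0, 0.01), ((2.0, 0.0), 10.0, 0.05)]
midpoint = deformability_at(touches, (1.0, 0.0))
```

A GP would additionally report uncertainty at each query point, which is what lets the framework decide where the next strategic touch is worth the cost.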

Place, publisher, year, edition, pages
IEEE, 2016
Keywords
Active perception, Deformability modeling, Gaussian process, Position-based dynamics, Tactile exploration, Anthropomorphic robots, Deformation, Dynamics, Gaussian noise (electronic), Probability distributions, Robots, Active perceptions, Environmental observation, Gaussian process regression, Gaussian Processes, Multiple interactions, Physical interactions, Probabilistic framework, Gaussian distribution
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:kth:diva-202842 (URN)
10.1109/HUMANOIDS.2016.7803326 (DOI)
000403009300081 ()
2-s2.0-85010190205 (Scopus ID)
9781509047185 (ISBN)
Conference
16th IEEE-RAS International Conference on Humanoid Robots, Humanoids 2016, 15 November 2016 through 17 November 2016
Note

QC 20170317

Available from: 2017-03-17 Created: 2017-03-17 Last updated: 2018-04-11. Bibliographically approved
7. Joint 3D Reconstruction of a Static Scene and Moving Objects
2017 (English) In: Proceedings of the 2017 International Conference on 3D Vision (3DV'17), 2017. Conference paper, Published paper (Other academic)
National Category
Other Engineering and Technologies not elsewhere specified
Identifiers
urn:nbn:se:kth:diva-225961 (URN)
10.1109/3DV.2017.00082 (DOI)
2-s2.0-85048741075 (Scopus ID)
9781538626108 (ISBN)
Conference
The 2017 International Conference on 3D Vision (3DV'17), Qingdao, China, October 2017
Note

QC 20180411

Available from: 2018-04-11 Created: 2018-04-11 Last updated: 2019-04-05. Bibliographically approved

Open Access in DiVA

fulltext: FULLTEXT01.pdf (5704 kB, application/pdf)

Author
Caccamo, Sergio
