Inertial and Video Data Integration.
Independent thesis, Advanced level (professional degree), 20 credits / 30 HE credits. Student thesis.
As part of its research projects, the Advanced Studies Department of Thales Simulation is studying mixed reality solutions as an alternative to training simulation. This study focuses on the design and realization of a new kind of 3D positioning, aimed at a supervision system in mixed reality. The core idea is the fusion of heterogeneous data from 6-axis inertial sensors and a camera, to locate and accurately orient individuals moving in an urban area. Using inertial data to estimate an object's attitude when visual data are not available is a goal that would have a strong impact on geolocalization and human-computer interfaces. But the most obvious approach, integrating acceleration over time to obtain position, is in fact theoretically a dead end. Nevertheless, the efficiency and quality of inertial measurement devices have been improving steadily for years, and such devices are becoming more and more affordable. The aim of this work was to assess whether, by using professional-quality inertial measurement units and combining several methods of signal processing and data analysis, it is possible to obtain a relevant estimation of an object's attitude, in terms of both position and orientation, when nothing more than inertial data are available, but after an arbitrarily long calibration phase during which video data were available.
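The claim that naive double integration is a theoretical dead end can be illustrated with a minimal sketch (not from the thesis itself): even a tiny constant accelerometer bias, after being integrated twice, produces a position error that grows quadratically with time, so the estimate diverges no matter how small the bias is. The sampling rate, bias, and noise values below are hypothetical.

```python
import random

def integrate_position(accels, dt):
    """Double-integrate acceleration samples into a position estimate."""
    v = 0.0  # velocity estimate
    p = 0.0  # position estimate
    for a in accels:
        v += a * dt   # first integration: acceleration -> velocity
        p += v * dt   # second integration: velocity -> position
    return p

random.seed(0)
dt = 0.01     # 100 Hz sampling (hypothetical)
bias = 0.02   # constant sensor bias in m/s^2 (hypothetical)

# The object is actually stationary: true acceleration is zero, but the
# sensor reports the bias plus zero-mean white noise.
samples_10s = [bias + random.gauss(0.0, 0.05) for _ in range(1000)]
samples_60s = [bias + random.gauss(0.0, 0.05) for _ in range(6000)]

p10 = integrate_position(samples_10s, dt)
p60 = integrate_position(samples_60s, dt)
# The bias alone contributes about 0.5 * bias * t**2 of spurious
# displacement: on the order of 1 m after 10 s, tens of metres after 60 s.
print(p10, p60)
```

This is why the thesis combines inertial data with visual data during a calibration phase instead of relying on integration alone.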
Series: Trita-CSC-E, ISSN 1653-5715 ; 2011:044
Identifiers
URN: urn:nbn:se:kth:diva-130681
OAI: oai:DiVA.org:kth-130681
DiVA: diva2:654128
Master of Science in Engineering - Computer Science and Technology