Time synchronization and data fusion for RGB-Depth cameras and inertial sensors in AAL applications
2015 (English). In: 2015 IEEE International Conference on Communication Workshop, ICCW 2015, IEEE conference proceedings, 2015, p. 265-270. Conference paper, published paper (refereed).
Resource type
Text
Abstract [en]
Ambient Assisted Living (AAL) applications often need to integrate data from multiple sensors to provide consistent information on the observed phenomena. Data fusion based on samples from several sensors requires accurate time synchronization, with a resolution that depends on the sensor sampling frequencies. This work presents a technical platform for the efficient and accurate synchronization of data captured from RGB-Depth cameras and wearable inertial sensors, which can be integrated in AAL solutions. A case study of sensor data fusion for the Timed Up and Go test is also presented and discussed.
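To make the fusion step concrete, the following is a minimal illustrative sketch (not the paper's implementation): once both streams are synchronized to a common clock, each RGB-Depth camera frame can be paired with the wearable inertial sensor sample whose timestamp is nearest, rejecting pairs whose gap exceeds a tolerance. The function name and the `max_skew` parameter are assumptions for illustration.

```python
# Hypothetical sketch of nearest-timestamp pairing between a camera stream
# and an IMU stream on a shared clock; not the authors' actual platform.
from bisect import bisect_left

def align_nearest(frame_ts, imu_ts, max_skew=0.010):
    """For each camera frame timestamp, return the index of the closest
    IMU sample, or None when the gap exceeds max_skew (seconds).
    Both input lists must be sorted in ascending order."""
    pairs = []
    for t in frame_ts:
        i = bisect_left(imu_ts, t)
        # Candidates: the IMU sample just before and just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_ts)]
        best = min(candidates, key=lambda j: abs(imu_ts[j] - t))
        pairs.append(best if abs(imu_ts[best] - t) <= max_skew else None)
    return pairs

# Example: ~30 Hz camera frames vs. 100 Hz IMU samples on a shared clock.
frames = [0.000, 0.033, 0.067]
imu = [i / 100.0 for i in range(10)]  # 0.00, 0.01, ..., 0.09
print(align_nearest(frames, imu))  # -> [0, 3, 7]
```

The `max_skew` threshold reflects the point made in the abstract: the usable tolerance is bounded by the sampling frequency of the faster stream, here a 100 Hz IMU giving a 10 ms sample spacing.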
Place, publisher, year, edition, pages
IEEE conference proceedings, 2015. p. 265-270
Keywords [en]
data fusion, depth camera, inertial sensor, synchronization, timed up and go, Cameras, Functional assessment, Inertial navigation systems, Wearable sensors, Wearable technology, Ambient assisted living, Multiple sensors, Sampling frequencies, Time synchronization, Sensor data fusion
National Category
Signal Processing; Other Engineering and Technologies
Identifiers
URN: urn:nbn:se:kth:diva-181539
DOI: 10.1109/ICCW.2015.7247189
ISI: 000380459900044
Scopus ID: 2-s2.0-84947751787
ISBN: 9781467363051 (print)
OAI: oai:DiVA.org:kth-181539
DiVA id: diva2:900585
Conference
IEEE International Conference on Communication Workshop, ICCW 2015, 8 June 2015 through 12 June 2015
Note
QC 20160204
2016-02-04 / 2016-02-02 / 2025-02-18. Bibliographically approved.