IMU-Camera Data Fusion: Horizontal Plane Observation with Explicit Outlier Rejection
KTH, School of Electrical Engineering (EES), Signal Processing. KTH, School of Electrical Engineering (EES), Centres, ACCESS Linnaeus Centre.
KTH, School of Electrical Engineering (EES), Signal Processing. KTH, School of Electrical Engineering (EES), Centres, ACCESS Linnaeus Centre. ORCID iD: 0000-0002-6855-5868
University of Illinois at Urbana-Champaign, Urbana, IL, USA.
2013 (English). In: 2013 International Conference on Indoor Positioning and Indoor Navigation, IPIN 2013, IEEE Computer Society, 2013, art. no. 6817890. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, we address the problem of egomotion estimation using an inertial measurement unit (IMU) and visual observations of planar features on the ground. The main practical difficulty for such a system is correctly identifying the ground-plane features among the visual observations. Herein, we propose a novel vision-aided inertial navigation system based on simultaneous motion estimation and ground-plane feature detection. We present a state-space formulation of the pose estimation problem and solve it via an augmented unscented Kalman filter. First, the predictions obtained by the Kalman filter are used to detect the ground-plane features. Second, the detected features are fed back to the motion estimation algorithm and used in the measurement-update phase of the filter. The detection algorithm consists of two steps, namely homography-based and normal-based outlier rejection. The presented integration algorithm allows 6-DoF motion estimation in practical scenarios where the camera is not restricted to observing only the ground plane. Real-world experiments in an indoor scenario demonstrate the accuracy and reliability of the proposed method in the presence of outliers and non-ground obstacles.
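The first detection step named in the abstract, homography-based outlier rejection, can be sketched as follows. This is a minimal illustration under assumed conventions (a y-down camera frame, plane satisfying n^T X = d, with the relative pose and plane parameters taken as given, e.g. from the filter's prediction), not the paper's implementation; the function names and the pixel threshold are hypothetical.

```python
import numpy as np

def plane_induced_homography(K, R, t, n, d):
    """Homography induced by the plane {X : n^T X = d} between two views.

    K: 3x3 camera intrinsics; R, t: predicted relative pose
    (view 1 -> view 2, X2 = R X1 + t); n: plane normal in the view-1
    camera frame; d: plane distance (e.g. camera height above the
    ground). For points on the plane, n^T X1 / d = 1, hence
    X2 = (R + t n^T / d) X1, and in pixels H = K (R + t n^T / d) K^-1.
    """
    return K @ (R + np.outer(t, n) / d) @ np.linalg.inv(K)

def homography_inliers(H, pts1, pts2, thresh_px=2.0):
    """Keep feature matches consistent with the plane-induced homography.

    pts1, pts2: (N, 2) matched pixel coordinates in the two views.
    A match is accepted as a ground-plane feature if the transfer error
    ||H x1 - x2|| is below thresh_px; off-plane points exhibit parallax
    relative to H and are rejected as outliers.
    """
    x1 = np.hstack([pts1, np.ones((pts1.shape[0], 1))])  # homogeneous
    x2_pred = (H @ x1.T).T
    x2_pred = x2_pred[:, :2] / x2_pred[:, 2:3]            # dehomogenize
    err = np.linalg.norm(x2_pred - pts2, axis=1)
    return err < thresh_px
```

In the paper's loop, R, t, n and d would come from the UKF prediction, and only the surviving matches would enter the measurement update; the second, normal-based step (presumably comparing an estimated surface normal against the IMU's gravity direction) would further filter matches that pass the homography test by accident.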

Place, publisher, year, edition, pages
IEEE Computer Society, 2013. Art. no. 6817890.
Series
International Conference on Indoor Positioning and Indoor Navigation, ISSN 2162-7347
Keyword [en]
Cameras, Computer vision, Data fusion, Feature extraction, State space methods, Statistics, Units of measurement, Ego-motion estimation, Inertial measurement unit, Integration algorithm, Motion estimation algorithm, Real world experiment, State space formulation, Unscented Kalman Filter, Vision-aided inertial navigation
National Category
Control Engineering
Identifiers
URN: urn:nbn:se:kth:diva-129260
DOI: 10.1109/IPIN.2013.6817890
ISI: 000341663400051
Scopus ID: 2-s2.0-84902148018
ISBN: 978-147994043-1 (print)
OAI: oai:DiVA.org:kth-129260
DiVA: diva2:651186
Conference
2013 International Conference on Indoor Positioning and Indoor Navigation, IPIN 2013; Montbéliard-Belfort; France; 28 October 2013 through 31 October 2013
Note

QC 20140912

Available from: 2013-09-24. Created: 2013-09-24. Last updated: 2014-10-20. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text
Scopus


Search in DiVA

By author/editor
Panahandeh, Ghazaleh; Jansson, Magnus
By organisation
Signal Processing; ACCESS Linnaeus Centre
Control Engineering
