This paper investigates the problem of visual-inertial navigation. The proposed navigation system integrates inertial information from an inertial measurement unit (IMU) with visual data from a camera to provide relative pose estimation for a system navigating in an unknown structured environment. The main contribution of this paper is the derivation of a novel measurement model based on inertial data and visual planar features. The proposed formulation is a solution to 6-DoF motion estimation in which the IMU-camera movement is not restricted to a desired navigation plane. Compared to previous works, which are restricted to using only horizontal-plane features, the proposed model is generalized to arbitrary planar features. The theoretical findings of this study are extensively evaluated in both simulation and real-world experiments. The presented experiments indicate the reliability of the proposed method in performing accurate 6-DoF pose estimation.
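To illustrate the kind of constraint a planar-feature measurement model imposes, the following is a minimal sketch, not the paper's actual formulation: a feature observed in the camera frame is transformed into the world frame with the estimated IMU-camera pose, and its signed distance to an arbitrary (not necessarily horizontal) plane serves as the measurement residual. All names (`R_wc`, `p_wc`, `n_w`, `d_w`) are hypothetical placeholders.

```python
import numpy as np

def plane_residual(R_wc, p_wc, f_c, n_w, d_w):
    """Signed-distance residual of a planar feature (illustrative sketch).

    R_wc : (3, 3) rotation from camera frame to world frame (assumed pose)
    p_wc : (3,)   camera position in the world frame (assumed pose)
    f_c  : (3,)   feature position expressed in the camera frame
    n_w  : (3,)   unit normal of the plane in the world frame
    d_w  : float  plane offset, so points p on the plane satisfy n_w @ p = d_w
    """
    # Transform the feature into the world frame using the current pose estimate.
    f_w = R_wc @ f_c + p_wc
    # If the pose and plane are consistent, the feature lies on the plane
    # and the residual is zero; otherwise it measures the pose error along n_w.
    return float(n_w @ f_w - d_w)

# Example: identity pose, plane z = 1, feature 1 m ahead along the optical axis.
r = plane_residual(np.eye(3), np.zeros(3),
                   np.array([0.0, 0.0, 1.0]),
                   np.array([0.0, 0.0, 1.0]), 1.0)
```

Because the plane normal `n_w` is arbitrary rather than fixed to the horizontal, a residual of this form can constrain all six degrees of freedom when features from differently oriented planes are combined.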
QC 20150828