A realistic benchmark for visual indoor place recognition
2010 (English). In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 0921-8830, Vol. 58, no. 1, pp. 81-96. Article in journal (Refereed). Published.
An important competence for a mobile robot system is the ability to localize and perform context interpretation. This is required to perform basic navigation and to facilitate location-specific services. Recent advances in vision have made this modality a viable alternative to the traditional range sensors, and visual place recognition algorithms have emerged as a useful and widely applied tool for obtaining information about the robot's position. Several place recognition methods have been proposed using vision alone or combined with sonar and/or laser. Research in this area calls for standard benchmark datasets for the development, evaluation and comparison of solutions. To this end, this paper presents two carefully designed and annotated image databases augmented with an experimental procedure and an extensive baseline evaluation. The databases were gathered in an uncontrolled indoor office environment using two mobile robots and a standard camera. The acquisition spanned several months and covered different illumination and weather conditions. The databases are therefore well suited for evaluating the robustness of algorithms with respect to the broad range of variations that often occur in real-world settings. We thoroughly assessed the databases with a purely appearance-based place recognition method based on support vector machines and two types of rich visual features (global and local).
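The appearance-based pipeline described in the abstract (global image features classified with a support vector machine) can be sketched roughly as follows. This is a minimal illustrative example, not the paper's implementation: the toy grayscale-histogram descriptor, the linear SVM trained by sub-gradient descent, and the synthetic "corridor"/"office" images are all assumptions standing in for the paper's actual rich global/local features and SVM setup.

```python
# Hypothetical sketch of appearance-based place recognition:
# a global image descriptor fed to one-vs-rest linear SVMs.
import numpy as np

def global_feature(image, bins=8):
    """Toy global descriptor: a normalized grayscale intensity histogram."""
    hist, _ = np.histogram(image.ravel(), bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def train_linear_svm(X, y, epochs=200, lr=0.1, lam=0.01):
    """One-vs-rest linear SVMs trained by sub-gradient descent on hinge loss."""
    classes = np.unique(y)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])          # append bias feature
    W = np.zeros((len(classes), d + 1))
    for c_idx, c in enumerate(classes):
        t = np.where(y == c, 1.0, -1.0)           # +1 for this place, -1 otherwise
        w = np.zeros(d + 1)
        for _ in range(epochs):
            margins = t * (Xb @ w)
            mask = margins < 1                    # samples violating the margin
            grad = lam * w - (t[mask, None] * Xb[mask]).sum(axis=0) / n
            w -= lr * grad
        W[c_idx] = w
    return classes, W

def predict(classes, W, X):
    """Assign each image to the place whose SVM scores highest."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return classes[np.argmax(Xb @ W.T, axis=1)]

# Synthetic "rooms": bright corridor images vs. dark office images.
rng = np.random.default_rng(0)
corridor = [rng.uniform(0.6, 1.0, (16, 16)) for _ in range(20)]
office = [rng.uniform(0.0, 0.4, (16, 16)) for _ in range(20)]
X = np.array([global_feature(im) for im in corridor + office])
y = np.array(["corridor"] * 20 + ["office"] * 20)

classes, W = train_linear_svm(X, y)
test = np.array([global_feature(rng.uniform(0.7, 1.0, (16, 16)))])
print(predict(classes, W, test)[0])  # a bright test image -> "corridor"
```

The one-vs-rest scheme mirrors a common way of applying binary SVMs to multi-class place recognition: one classifier per place, with the winning margin deciding the predicted location.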
Place, publisher, year, edition, pages
2010. Vol. 58, no. 1, pp. 81-96.
Keywords: Visual place recognition, Robot topological localization, Standard robotic benchmark, localization, appearance, map
Computer and Information Science
Identifiers
URN: urn:nbn:se:kth:diva-19070
DOI: 10.1016/j.robot.2009.07.025
ISI: 000272968200008
Scopus ID: 2-s2.0-70450221694
OAI: oai:DiVA.org:kth-19070
DiVA: diva2:337117
Funder: Swedish Research Council, 2005-3600-Complex
QC 20100525. Available from: 2010-08-05. Created: 2010-08-05. Last updated: 2011-12-09. Bibliographically approved.