Observability-Aware Online Multi-Lidar Extrinsic Calibration
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering; Scan CV AB, S-15132 Södertälje, Sweden. ORCID iD: 0000-0002-7528-1383
Scan CV AB, S-15132 Södertälje, Sweden.
Oxford Robotics Institute, Oxford OX2 6NN, England.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. ORCID iD: 0000-0003-2638-6047
2023 (English). In: IEEE Robotics and Automation Letters, E-ISSN 2377-3766, Vol. 8, no. 5, p. 2860-2867. Article in journal (Refereed). Published.
Abstract [en]

Accurate and robust extrinsic calibration is necessary for deploying autonomous systems that need multiple sensors for perception. In this letter, we present a robust system for real-time extrinsic calibration of multiple lidars in the vehicle base frame without the need for any fiducial markers or features. We base our approach on matching absolute GNSS (Global Navigation Satellite System) and estimated lidar poses in real-time. Comparing rotation components allows us to improve the robustness of the solution over the traditional least-squares approach, which compares translation components only. Additionally, instead of comparing all corresponding poses, we select the poses carrying maximum mutual information based on our novel observability criteria. This allows us to identify a subset of the poses helpful for real-time calibration. We also provide stopping criteria to ensure calibration completion. To validate our approach, extensive tests were carried out on data collected using Scania test vehicles (7 sequences for a total of approximately 6.5 km). The results presented in this letter show that our approach is able to accurately determine the extrinsic calibration for various combinations of sensor setups.
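The pose-matching idea from the abstract can be illustrated with a generic hand-eye-style rotation alignment. This is a sketch of the general technique, not the letter's actual algorithm: given paired relative rotations from GNSS and lidar trajectories, the extrinsic rotation aligns their rotation axes, here solved with a Kabsch step. All function names are illustrative assumptions.

```python
import numpy as np

def rotation_axis(R):
    """Unit rotation axis from the skew-symmetric part of a rotation matrix."""
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    n = np.linalg.norm(v)
    return v / n if n > 1e-9 else np.zeros(3)

def solve_extrinsic_rotation(rel_rots_a, rel_rots_b):
    """Estimate X such that Ra @ X = X @ Rb for paired relative rotations
    (frame a: e.g. GNSS/vehicle, frame b: e.g. lidar).  The relation implies
    axis(Ra) = X @ axis(Rb), so X is recovered by Kabsch alignment of the
    two sets of rotation axes."""
    A = np.stack([rotation_axis(R) for R in rel_rots_a])
    B = np.stack([rotation_axis(R) for R in rel_rots_b])
    H = B.T @ A                                   # cross-covariance of paired axes
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```

Rotation-only alignment of this kind is well conditioned only when the trajectory excites at least two independent rotation axes, which is one reason pose selection and observability analysis matter for online calibration.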

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023. Vol. 8, no. 5, p. 2860-2867
Keywords [en]
Calibration and identification, autonomous vehicle navigation, sensor fusion
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:kth:diva-326661
DOI: 10.1109/LRA.2023.3262176
ISI: 000964797800011
Scopus ID: 2-s2.0-85151572135
OAI: oai:DiVA.org:kth-326661
DiVA, id: diva2:1755414
Note

QC 20230508

Available from: 2023-05-08. Created: 2023-05-08. Last updated: 2024-02-13. Bibliographically approved.
In thesis
1. State estimation with auto-calibrated sensor setup
2024 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Localization and mapping is one of the key aspects of driving autonomously in unstructured environments. Often such vehicles are equipped with multiple sensor modalities to create 360° sensing coverage and to add redundancy for handling sensor-dropout scenarios. As the vehicles operate in underground mining and dense urban environments, the Global Navigation Satellite System (GNSS) is often unreliable. Hence, to create a robust localization system, different sensor modalities such as camera, lidar and IMU are used along with a GNSS solution. The system must handle sensor dropouts and work in real-time (~15 Hz), so that enough computation budget is left for other tasks like planning and control. Additionally, precise localization is needed to map the environment, which may later be used for re-localization of the autonomous vehicles as well. Finally, for all of this to work seamlessly, accurate calibration of the sensors is of utmost importance.

In this PhD thesis, first, a robust system for state estimation that fuses measurements from multiple lidars and inertial sensors with GNSS data is presented. State estimation was performed in real-time, producing robust motion estimates in a global frame by fusing lidar and IMU signals with GNSS components in a factor-graph framework. The proposed method handled signal loss with a novel synchronization and fusion mechanism. To validate the approach, extensive tests were carried out on data collected using Scania test vehicles (5 sequences for a total of ~7 km). An average improvement of 61% in relative translation error and 42% in rotational error compared to a state-of-the-art estimator fusing a single lidar/inertial sensor pair is reported.
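The fusion idea above can be sketched in miniature. The following is a toy 1-D stand-in for a factor graph, not the thesis system: relative odometry factors chain consecutive states together, sparse absolute GNSS factors anchor the chain globally, and missing GNSS entries model signal dropout. The function name and noise parameters are assumptions.

```python
import numpy as np

def fuse_trajectory(odom, gnss, s_odom=0.1, s_gnss=0.5):
    """Toy 1-D trajectory fusion: states x_0..x_n are linked by odometry
    factors x_{t+1} - x_t = odom[t]; sparse GNSS factors x_t = z pin the
    trajectory in the global frame.  gnss is a dict {t: z}; missing time
    steps model GNSS dropout.  Solved as one linear least-squares problem,
    with rows whitened by the factor standard deviations."""
    n = len(odom) + 1
    rows, rhs = [], []
    for t, d in enumerate(odom):              # relative (odometry) factors
        r = np.zeros(n); r[t], r[t + 1] = -1.0, 1.0
        rows.append(r / s_odom); rhs.append(d / s_odom)
    for t, z in gnss.items():                 # absolute (GNSS) factors
        r = np.zeros(n); r[t] = 1.0
        rows.append(r / s_gnss); rhs.append(z / s_gnss)
    A, b = np.array(rows), np.array(rhs)
    return np.linalg.lstsq(A, b, rcond=None)[0]
```

Even with total GNSS dropout, the odometry factors still determine the trajectory's shape; the GNSS factors only fix its global position, which mirrors why the fused system degrades gracefully under signal loss.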

Since precise calibration is needed for the localization and mapping tasks, methods for real-time calibration of the sensor setup are also proposed in this thesis. First, a method is proposed to calibrate sensors with non-overlapping fields of view. The calibration quality is verified by mapping known features in the environment. However, that verification process was not real-time, and no observability analysis was performed that could indicate whether the trajectory provides the excitation required for motion-based online calibration. Hence, a new method is proposed in which calibration and verification are performed in real-time by matching estimated sensor poses, together with an observability analysis. Both of these methods rely on estimating the sensor poses using the state estimator developed in our earlier work. However, state estimators have inherent drift and are computationally intensive. Thus, another novel method is developed in which the sensors can be calibrated in real-time without the need for any state estimation.
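The observability-aware pose selection described above can be sketched with a greedy D-optimality criterion: keep the pose pairs whose measurement Jacobians add the most log-determinant to the accumulated information matrix. This is a generic stand-in for the thesis's mutual-information/observability criteria, not its actual formulation; the function name and the small prior are assumptions.

```python
import numpy as np

def select_informative_poses(jacobians, k):
    """Greedily pick k measurements maximizing the log-determinant of the
    accumulated Fisher information I = sum_i J_i^T J_i.  A small prior keeps
    the log-determinant finite before any measurement is chosen."""
    dim = jacobians[0].shape[1]
    info = 1e-6 * np.eye(dim)                 # prior information (assumed)
    chosen = []
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i, J in enumerate(jacobians):
            if i in chosen:
                continue
            # log-det of the information if measurement i were added
            gain = np.linalg.slogdet(info + J.T @ J)[1]
            if gain > best_gain:
                best, best_gain = i, gain
        chosen.append(best)
        info += jacobians[best].T @ jacobians[best]
    return chosen
```

A selection rule of this kind favors measurements that excite directions of the calibration parameters not yet constrained, which is the intuition behind using only a subset of poses for online calibration.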

Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2024. p. 151
Series
TRITA-EECS-AVL ; 2024:8
Keywords
SLAM, Sensor calibration, Autonomous driving
National Category
Signal Processing; Robotics
Research subject
Electrical Engineering
Identifiers
urn:nbn:se:kth:diva-343412 (URN)
978-91-8040-806-6 (ISBN)
Public defence
2024-03-08, https://kth-se.zoom.us/s/63372097801, F3, Lindstedtsvägen 26, Stockholm, 13:00 (English)
Opponent
Supervisors
Funder
Swedish Foundation for Strategic Research
Note

QC 20240213

Available from: 2024-02-14. Created: 2024-02-12. Last updated: 2024-03-07. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Das, Sandipan
Chatterjee, Saikat

