Multi-modal curb detection and filtering
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. Scania, Sweden. ORCID iD: 0000-0002-7528-1383
Scania, Sweden.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. ORCID iD: 0000-0003-2638-6047
Oxford Robotics Institute, UK. ORCID iD: 0000-0003-2940-0879
2022 (English) Conference contribution, poster (with or without abstract) (Other academic)
Abstract [en]

Reliable knowledge of road boundaries is critical for autonomous vehicle navigation. We propose a robust curb detection and filtering technique based on the fusion of camera semantics and dense lidar point clouds. The lidar point clouds are collected by fusing multiple lidars for robust feature detection. The camera semantics are based on a modified EfficientNet architecture trained on labeled data collected from onboard fisheye cameras. The point clouds are associated with the closest curb segment using L2-norm analysis after being projected into image space with the fisheye projection model. Next, the selected points are clustered using unsupervised density-based spatial clustering to detect different curb regions. As new curb points are detected in consecutive frames, they are associated with the existing curb clusters using temporal reachability constraints. If no reachability constraint is found, a new curb cluster is formed from the new points. This ensures that multiple curbs can be detected in road segments with multiple lanes, provided they are in the sensors' field of view. Finally, Delaunay filtering is applied for outlier removal, and its performance is compared to traditional RANSAC-based filtering. An objective evaluation of the proposed solution is performed using a high-definition map containing ground-truth curb points obtained from a commercial map supplier. The proposed system has proven capable of detecting curbs of any orientation in complex urban road scenarios comprising straight roads, curved roads, and intersections with traffic isles.
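The unsupervised density-based clustering step can be sketched with a minimal DBSCAN over projected curb points. All data, parameter values, and the `dbscan` helper below are illustrative assumptions, not values or code from the paper:

```python
import math
from collections import deque

def dbscan(points, eps, min_samples):
    """Minimal DBSCAN: label each point with a cluster id, or -1 for noise."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    # Neighbourhoods; each point counts as its own neighbour, as in DBSCAN.
    nbrs = [[j for j in range(n) if dist(i, j) <= eps] for i in range(n)]
    core = [len(nb) >= min_samples for nb in nbrs]
    labels = [-1] * n
    cid = 0
    for i in range(n):
        if labels[i] != -1 or not core[i]:
            continue
        # Breadth-first expansion from an unvisited core point.
        labels[i] = cid
        queue = deque([i])
        while queue:
            p = queue.popleft()
            if not core[p]:
                continue  # border points join a cluster but do not expand it
            for q in nbrs[p]:
                if labels[q] == -1:
                    labels[q] = cid
                    queue.append(q)
        cid += 1
    return labels

# Illustrative projected curb points (metres): two curb segments, one outlier.
points = [(0.0, 0.0), (0.1, 0.05), (0.2, 0.1), (0.3, 0.15),    # curb A
          (5.0, 0.0), (5.1, 0.02), (5.2, 0.05), (5.3, 0.08),   # curb B
          (2.5, 3.0)]                                          # stray return
labels = dbscan(points, eps=0.3, min_samples=3)
# Two curb clusters are found; the isolated point is labelled -1 (noise).
```

In the paper's pipeline, the resulting clusters are the per-frame curb regions that are then tracked across frames via the temporal reachability constraints.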

Place, publisher, year, edition, pages
2022.
National subject category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:kth:diva-343533; OAI: oai:DiVA.org:kth-343533; DiVA id: diva2:1838405
Conference
IEEE International Conference on Robotics and Automation (ICRA) Workshop: Robotic Perception and Mapping - Emerging Techniques, May 23, 2022, Philadelphia, USA
Note

QC 20240216

Available from: 2024-02-16. Created: 2024-02-16. Last updated: 2024-02-16. Bibliographically reviewed.
Part of thesis
1. State estimation with auto-calibrated sensor setup
2024 (English) Doctoral thesis, compilation (Other academic)
Abstract [en]

Localization and mapping is one of the key aspects of driving autonomously in unstructured environments. Such vehicles are often equipped with multiple sensor modalities to provide 360° sensing coverage and to add redundancy for handling sensor-dropout scenarios. As the vehicles operate in underground mining and dense urban environments, the Global Navigation Satellite System (GNSS) is often unreliable. Hence, to create a robust localization system, different sensor modalities like camera, lidar, and IMU are used along with a GNSS solution. The system must handle sensor dropouts and work in real-time (~15 Hz), so that enough computation budget is left for other tasks like planning and control. Additionally, precise localization is needed to map the environment, which may later be used for re-localization of the autonomous vehicles as well. Finally, for all of this to work seamlessly, accurate calibration of the sensors is of utmost importance.

In this PhD thesis, first, a robust system for state estimation that fuses measurements from multiple lidars and inertial sensors with GNSS data is presented. State estimation is performed in real-time, producing robust motion estimates in a global frame by fusing lidar and IMU signals with GNSS components in a factor graph framework. The proposed method handles signal loss with a novel synchronization and fusion mechanism. To validate the approach, extensive tests were carried out on data collected with Scania test vehicles (5 sequences, ~7 km in total). An average improvement of 61% in relative translation error and 42% in rotation error is reported, compared to a state-of-the-art estimator fusing a single lidar/inertial sensor pair.
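The factor graph framework mentioned above can be illustrated with a deliberately tiny example: a 1-D chain of three poses linked by odometry factors, with absolute (GNSS-like) measurements on the first and last pose, solved as a weighted linear least-squares problem. This is a toy sketch of the principle only, not the thesis's estimator; all numbers are invented:

```python
import numpy as np

# Factors: each contributes one row to a linear system A x = b,
# weighted by 1/sigma (smaller sigma = more trusted measurement).
odometry = [(0, 1, 2.0, 0.1), (1, 2, 2.0, 0.1)]  # (i, j, measured x_j - x_i, sigma)
gnss     = [(0, 0.0, 0.5), (2, 4.3, 0.5)]        # (i, measured x_i, sigma)

rows, rhs = [], []
for i, j, dz, s in odometry:
    r = np.zeros(3)
    r[i], r[j] = -1.0, 1.0        # relative factor: x_j - x_i = dz
    rows.append(r / s)
    rhs.append(dz / s)
for i, z, s in gnss:
    r = np.zeros(3)
    r[i] = 1.0                    # absolute factor: x_i = z
    rows.append(r / s)
    rhs.append(z / s)

# Maximum-likelihood estimate under Gaussian noise = weighted least squares.
x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
```

The solution blends the slightly inconsistent odometry and GNSS measurements according to their weights, which is the same mechanism a full lidar/IMU/GNSS factor graph uses in higher dimensions with nonlinear factors.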

Since precise calibration is needed for the localization and mapping tasks, methods for real-time calibration of the sensor setup are also proposed in this thesis. First, a method is proposed to calibrate sensors with non-overlapping fields of view, where calibration quality is verified by mapping known features in the environment. However, this verification process was not real-time, and no observability analysis was performed that could indicate whether the trajectory provides the excitation required for motion-based online calibration. Hence, a new method is proposed in which calibration and verification are performed in real-time by matching estimated sensor poses, supported by an observability analysis. Both of these methods rely on sensor poses estimated by the state estimator developed in our earlier work. However, state estimators have inherent drift and are computationally intensive. Thus, another novel method is developed in which the sensors can be calibrated in real-time without the need for any state estimation.
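Motion-based calibration of the kind described above ultimately reduces to aligning trajectories observed by different sensors to recover their fixed mounting transform. A minimal sketch of that core alignment step, using a 2-D Kabsch/Procrustes fit on synthetic data (the trajectory and the "true" mounting transform below are invented for illustration, and this is not the thesis's method):

```python
import numpy as np

# Trajectory positions seen by sensor A, and the same trajectory as seen by
# sensor B, which is mounted with a fixed (unknown) rotation and translation.
A = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.5], [3.0, 1.5]])
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([0.5, -0.2])
B = A @ R_true.T + t_true           # b_i = R_true @ a_i + t_true

# Kabsch/Procrustes: centre both point sets, take the SVD of the correlation
# matrix, and read off the best-fit rotation; the translation follows from
# the centroids. The sign correction keeps det(R) = +1 (a proper rotation).
a0, b0 = A - A.mean(0), B - B.mean(0)
U, _, Vt = np.linalg.svd(a0.T @ b0)
d = np.sign(np.linalg.det(Vt.T @ U.T))
R_est = Vt.T @ np.diag([1.0, d]) @ U.T
t_est = B.mean(0) - R_est @ A.mean(0)
```

With noise-free data the estimate recovers the mounting transform exactly; in practice the same fit is run over noisy, time-synchronized pose estimates, which is where the observability analysis (is the motion exciting enough?) becomes important.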

Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2024. p. 151
Series
TRITA-EECS-AVL ; 2024:8
Keywords
SLAM, Sensor calibration, Autonomous driving
National subject category
Signal Processing; Robotics and Automation
Research subject
Electrical and Systems Engineering
Identifiers
urn:nbn:se:kth:diva-343412 (URN); 978-91-8040-806-6 (ISBN)
Public defence
2024-03-08, https://kth-se.zoom.us/s/63372097801, F3, Lindstedtsvägen 26, Stockholm, 13:00 (English)
Research funder
Swedish Foundation for Strategic Research (SSF)
Note

QC 20240213

Available from: 2024-02-14. Created: 2024-02-12. Last updated: 2024-03-07. Bibliographically reviewed.

Open Access in DiVA

fulltext (2789 kB), 56 downloads
File information
File name: FULLTEXT01.pdf; File size: 2789 kB; Checksum (SHA-512):
1bfa80f5d2a0115fbb022ad4e81fdb05554b45b3f5ddf74595df19070b1ed1a9fc97f2ccb0e04262ef33501b36404dd6d006d2fa711d231a33cad32fc5e5db5d
Type: fulltext; MIME type: application/pdf

Other links

arXiv

Authors
Das, Sandipan; Chatterjee, Saikat; Fallon, Maurice
Total: 58 downloads
The number of downloads is the sum of downloads for all full texts. It may include earlier versions that are no longer available.
