Unified multi-lateral filter for real-time depth map enhancement
2015 (English). In: Image and Vision Computing, ISSN 0262-8856, E-ISSN 1872-8138, Vol. 41, p. 26-41. Article in journal (Refereed). Published.
Abstract [en]

This paper proposes a unified multi-lateral filter to efficiently increase the spatial resolution of low-resolution, noisy depth maps in real time. Time-of-Flight (ToF) cameras have become a very promising alternative to stereo-based range sensing systems, as they provide depth measurements at a high frame rate. However, two main drawbacks restrict their use in a wide range of applications: their fairly low spatial resolution and the amount of noise in the depth estimates. To address these drawbacks, we propose a new approach based on sensor fusion: we couple a low-resolution ToF camera with a higher-resolution 2-D camera, to whose resolution the low-resolution depth map is efficiently upsampled. We first review the existing depth map enhancement approaches based on sensor fusion and discuss their limitations. We then propose a unified multi-lateral filter that accounts for the inaccurate position of depth edges in the low-resolution ToF depth maps. By doing so, unwanted artefacts such as texture copying and edge blurring are almost entirely eliminated. Moreover, the proposed filter can be configured to behave as most of the alternative depth enhancement approaches. Using a convolution-based formulation together with data quantization and downsampling, the described filter has been implemented effectively and efficiently for dynamic scenes in real-time applications. The experimental results show a clear qualitative as well as quantitative improvement over raw depth maps, outperforming state-of-the-art multi-lateral filters. © 2015 Elsevier B.V. All rights reserved.
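To make the fusion idea concrete, the following is a minimal sketch of joint (cross) bilateral upsampling, one of the baseline sensor-fusion filters the paper's unified multi-lateral filter generalises: a low-resolution ToF depth map is upsampled to the resolution of a high-resolution 2-D guide image, with the guide's intensity edges steering the range weights. This is not the authors' unified filter (which adds further terms, e.g. for depth-edge inaccuracy, and a fast convolution-based implementation); all function names and parameters here are illustrative assumptions.

```python
# Illustrative joint bilateral upsampling sketch (not the paper's unified
# multi-lateral filter). Assumed names/parameters: sigma_s (spatial),
# sigma_r (range on the guide image), radius (window in low-res pixels).
import numpy as np

def joint_bilateral_upsample(depth_lo, guide, sigma_s=2.0, sigma_r=0.1, radius=3):
    """Upsample depth_lo to the guide image's resolution.

    depth_lo : (h, w) low-resolution depth map
    guide    : (H, W) high-resolution intensity image, values in [0, 1]
    """
    H, W = guide.shape
    h, w = depth_lo.shape
    fy, fx = H / h, W / w
    out = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            cy, cx = y / fy, x / fx  # corresponding position in the low-res grid
            num = den = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    # clamp low-res neighbour coordinates to the image
                    ly = min(max(int(round(cy)) + dy, 0), h - 1)
                    lx = min(max(int(round(cx)) + dx, 0), w - 1)
                    # where that neighbour falls in the high-res guide
                    gy = min(max(int(round(ly * fy)), 0), H - 1)
                    gx = min(max(int(round(lx * fx)), 0), W - 1)
                    # spatial weight (distance in the low-res grid)
                    ws = np.exp(-(dy * dy + dx * dx) / (2.0 * sigma_s ** 2))
                    # range weight from the high-res guide image, not the depth:
                    # this is what keeps depth edges aligned with intensity edges
                    wr = np.exp(-((guide[y, x] - guide[gy, gx]) ** 2)
                                / (2.0 * sigma_r ** 2))
                    num += ws * wr * depth_lo[ly, lx]
                    den += ws * wr
            out[y, x] = num / den
    return out
```

The output at each high-resolution pixel is a normalised weighted average of low-resolution depth samples, so it stays within the range of the input depth values; the texture-copying and edge-blurring artefacts discussed in the abstract arise precisely because this baseline trusts the guide's intensity edges unconditionally.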

Place, publisher, year, edition, pages
Elsevier, 2015. Vol. 41, p. 26-41
Keywords [en]
Depth enhancement; Data fusion; Sensor fusion; Multi-modal sensors; Adaptive filters; Active sensing; Time-of-Flight
National Category
Signal Processing
Identifiers
URN: urn:nbn:se:kth:diva-259156
DOI: 10.1016/j.imavis.2015.06.008
ISI: 000360595600003
Scopus ID: 2-s2.0-84937560442
OAI: oai:DiVA.org:kth-259156
DiVA, id: diva2:1350600
Note

QC 20191024

Available from: 2019-09-11. Created: 2019-09-11. Last updated: 2019-10-24. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text: http://www.sciencedirect.com/science/article/pii/S0262885615000773
Scopus

Search in DiVA

By author/editor
Ottersten, Björn
In the same journal
Image and Vision Computing
Signal Processing

Search outside of DiVA

Google
Google Scholar
