Asynchronous Corner Tracking Algorithm Based on Lifetime of Events for DAVIS Cameras
2020 (English). In: 15th International Symposium on Visual Computing, ISVC 2020, Springer Science and Business Media Deutschland GmbH, 2020, p. 530-541. Conference paper, Published paper (Refereed)
Abstract [en]

Event cameras, such as the Dynamic and Active-pixel Vision Sensor (DAVIS), capture intensity changes in the scene and generate a stream of events in an asynchronous fashion. The output rate of such cameras can reach up to 10 million events per second in highly dynamic environments. DAVIS cameras use novel vision sensors that mimic the human eye. Their attractive attributes, such as high output rate, High Dynamic Range (HDR), and high pixel bandwidth, make them an ideal solution for applications that require high-frequency tracking. Moreover, applications that operate in challenging lighting scenarios can exploit the high dynamic range of event cameras, i.e., 140 dB compared to the 60 dB of traditional cameras. In this paper, a novel asynchronous corner tracking method is proposed that uses both the events and the intensity images captured by a DAVIS camera. The Harris algorithm is used to extract features, i.e., frame corners, from keyframes, i.e., intensity images. Afterward, a matching algorithm is used to extract event corners from the stream of events. Events alone are then used to perform asynchronous tracking until the next keyframe is captured. Neighboring events, within a window of 5 × 5 pixels around each event corner, are used to calculate the velocity and direction of the extracted event corners by fitting a 2D plane using a randomized Hough transform algorithm. Experimental evaluation showed that our approach is able to update the location of the extracted corners up to 100 times during the blind time of traditional cameras, i.e., between two consecutive intensity images.
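The core idea in the abstract — events generated by a moving edge or corner lie on a plane in (x, y, t) space, whose gradient encodes the corner's image-plane velocity — can be sketched as follows. This is a minimal illustration using an ordinary least-squares plane fit as a stand-in for the randomized Hough transform the paper actually uses; the function name and the synthetic event data are hypothetical, not taken from the authors' implementation.

```python
import numpy as np

def estimate_corner_velocity(events):
    """Fit a plane t = a*x + b*y + c to local events (x, y, t) and
    derive the corner's image-plane velocity.

    Least-squares stand-in for the paper's randomized Hough transform;
    the underlying model is the same: events from a moving corner lie
    on a plane in (x, y, t) space, and v = grad(t) / |grad(t)|^2.
    """
    xs, ys, ts = (np.asarray(c, dtype=float) for c in zip(*events))
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    (a, b, _), *rest = np.linalg.lstsq(A, ts, rcond=None)
    g2 = a * a + b * b                 # squared gradient magnitude |grad(t)|^2
    if g2 < 1e-12:
        return np.zeros(2)             # degenerate fit: no observable motion
    return np.array([a, b]) / g2       # velocity in pixels per time unit

# Synthetic 5x5 event window for a corner moving at 2 px/time-unit along +x:
events = [(x, y, x / 2.0) for x in range(5) for y in range(5)]
v = estimate_corner_velocity(events)   # approximately [2, 0]
```

In the paper's pipeline this fit would be applied to the 5 × 5 pixel neighborhood around each event corner, yielding the velocity and direction used to propagate the corner's position between keyframes.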

Place, publisher, year, edition, pages
Springer Science and Business Media Deutschland GmbH, 2020, p. 530-541.
Keywords [en]
Asynchronous tracking, Corner, Event cameras, Hough transform, Lifetime, Hough transforms, Pixels, Tracking (position), Experimental evaluation, Harris algorithm, High dynamic range, High frequency HF, Intensity change, Matching algorithm, Randomized Hough transform, Tracking algorithm, Cameras
National Category
Computer graphics and computer vision
Identifiers
URN: urn:nbn:se:kth:diva-302932
DOI: 10.1007/978-3-030-64556-4_41
Scopus ID: 2-s2.0-85098193664
OAI: oai:DiVA.org:kth-302932
DiVA, id: diva2:1599882
Conference
5 October 2020 through 7 October 2020
Note

QC 20211002

Available from: 2021-10-02. Created: 2021-10-02. Last updated: 2025-02-07. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | Scopus

Authority records

Tenhunen, Hannu

By organisation
Integrated devices and circuits
Computer graphics and computer vision
