Real-time panoramic tracking for event cameras

Christian Reinbacher, Gottfried Munda, Thomas Pock

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Abstract

Event cameras are a paradigm shift in camera technology. Instead of full frames, the sensor captures a sparse set of events caused by intensity changes. Since only the changes are transferred, these cameras are able to capture quick movements of objects in the scene or of the camera itself. In this work we propose a novel method to perform camera tracking of event cameras in a panoramic setting with three degrees of freedom. We propose a direct camera tracking formulation, similar to the state of the art in visual odometry. We show that the minimal information needed for simultaneous tracking and mapping is the spatial position of events, without using the appearance of the imaged scene points. We verify robustness to fast camera movements and dynamic objects in the scene on a recently proposed dataset [18] and on self-recorded sequences.
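The abstract's key point is that three-degree-of-freedom (rotation-only) tracking against a panoramic map can be driven purely by where events fire on the sensor. The paper's actual formulation is not reproduced here; as a rough geometric sketch only, the code below back-projects an event's pixel position to a viewing ray, rotates it by a given camera orientation, and bins it into an equirectangular map. The intrinsics K, the panorama resolution, and the helper names (event_to_panorama, accumulate_events) are illustrative assumptions, not taken from the paper.

import numpy as np

# --- Illustrative constants, NOT taken from the paper ---
K = np.array([[200.0, 0.0, 64.0],      # assumed pinhole intrinsics for a 128x128 event sensor
              [0.0, 200.0, 64.0],
              [0.0, 0.0, 1.0]])
PAN_W, PAN_H = 1024, 512                # assumed equirectangular panorama resolution


def event_to_panorama(u, v, R):
    """Map one event at pixel (u, v) to panorama coordinates, given the
    current camera rotation R (world-from-camera, 3x3).

    Illustrates the 3-DoF geometry only: back-project the pixel to a ray,
    rotate it into the world frame, and convert the ray's azimuth/elevation
    to equirectangular map coordinates."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray_cam /= np.linalg.norm(ray_cam)
    ray_world = R @ ray_cam
    azimuth = np.arctan2(ray_world[0], ray_world[2])          # in [-pi, pi]
    elevation = np.arcsin(np.clip(ray_world[1], -1.0, 1.0))   # in [-pi/2, pi/2]
    px = int((azimuth / (2 * np.pi) + 0.5) * PAN_W) % PAN_W
    py = min(int((elevation / np.pi + 0.5) * PAN_H), PAN_H - 1)
    return px, py


def accumulate_events(events, R, pano=None):
    """Bin a batch of events (rows of [u, v]) into a panoramic event map.
    In a tracking-and-mapping loop the rotation R would be re-estimated per
    batch by aligning the new events against this map; here R is given."""
    if pano is None:
        pano = np.zeros((PAN_H, PAN_W), dtype=np.float32)
    for u, v in events:
        px, py = event_to_panorama(u, v, R)
        pano[py, px] += 1.0
    return pano

In the paper's setting the rotation is the unknown being tracked and is estimated directly from the event stream and the map; the sketch above takes it as given and only shows how event positions, without any appearance information, populate the panoramic map.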

Original language: English
Title of host publication: 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers
ISBN (Electronic): 9781509057450
DOIs
Publication status: Published - 16 Jun 2017
Event: 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Stanford, United States
Duration: 12 May 2017 - 14 May 2017

Conference

Conference: 2017 IEEE International Conference on Computational Photography, ICCP 2017
Abbreviated title: ICCP 2017
Country/Territory: United States
City: Stanford
Period: 12/05/17 - 14/05/17

ASJC Scopus subject areas

  • Instrumentation
  • Atomic and Molecular Physics, and Optics
  • Computational Theory and Mathematics
  • Computer Vision and Pattern Recognition
