Real-time panoramic tracking for event cameras

Christian Reinbacher, Gottfried Munda, Thomas Pock

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Event cameras are a paradigm shift in camera technology. Instead of full frames, the sensor captures a sparse set of events caused by intensity changes. Since only the changes are transferred, those cameras are able to capture quick movements of objects in the scene or of the camera itself. In this work we propose a novel method to perform camera tracking of event cameras in a panoramic setting with three degrees of freedom. We propose a direct camera tracking formulation, similar to state-of-the-art in visual odometry. We show that the minimal information needed for simultaneous tracking and mapping is the spatial position of events, without using the appearance of the imaged scene point. We verify the robustness to fast camera movements and dynamic objects in the scene on a recently proposed dataset [18] and self-recorded sequences.
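As an illustrative aside (not the authors' implementation), the core geometric idea the abstract describes — mapping each event's pixel position into a panoramic map using only a 3-DOF rotation estimate, with no scene appearance — can be sketched as follows. The intrinsic matrix `K`, the panorama dimensions, and the equirectangular parameterization are assumptions for illustration:

```python
import numpy as np

def event_to_panorama(x, y, R, K, pano_w, pano_h):
    """Project an event's pixel position (x, y) into an equirectangular
    panorama using only the current 3-DOF camera rotation R.
    Only the event's spatial position is used, no intensity values."""
    # Back-project the pixel to a viewing ray in camera coordinates.
    ray = np.linalg.inv(K) @ np.array([x, y, 1.0])
    # Rotate the ray into the fixed panorama frame and normalize it.
    d = R @ ray
    d /= np.linalg.norm(d)
    # Convert the direction to spherical angles, then to panorama pixels.
    lon = np.arctan2(d[0], d[2])            # longitude in [-pi, pi]
    lat = np.arcsin(np.clip(d[1], -1, 1))   # latitude in [-pi/2, pi/2]
    u = (lon / (2 * np.pi) + 0.5) * pano_w
    v = (lat / np.pi + 0.5) * pano_h
    return u, v
```

A tracker in this spirit would adjust `R` so that incoming events land on previously mapped event positions in the panorama; the actual optimization in the paper differs and is not reproduced here.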

Language: English
Title of host publication: 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers
ISBN (Electronic): 9781509057450
DOIs: 10.1109/ICCPHOT.2017.7951488
Status: Published - 16 Jun 2017
Event: 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Stanford, United States
Duration: 12 May 2017 - 14 May 2017

Conference

Conference: 2017 IEEE International Conference on Computational Photography, ICCP 2017
Abbreviated title: ICCP 2017
Country: United States
City: Stanford
Period: 12/05/17 - 14/05/17

ASJC Scopus subject areas

  • Instrumentation
  • Atomic and Molecular Physics, and Optics
  • Computational Theory and Mathematics
  • Computer Vision and Pattern Recognition

Cite this

Reinbacher, C., Munda, G., & Pock, T. (2017). Real-time panoramic tracking for event cameras. In 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings [7951488] Institute of Electrical and Electronics Engineers. DOI: 10.1109/ICCPHOT.2017.7951488

Real-time panoramic tracking for event cameras. / Reinbacher, Christian; Munda, Gottfried; Pock, Thomas.

2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings. Institute of Electrical and Electronics Engineers, 2017. 7951488.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Reinbacher, C, Munda, G & Pock, T 2017, Real-time panoramic tracking for event cameras. in 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings., 7951488, Institute of Electrical and Electronics Engineers, 2017 IEEE International Conference on Computational Photography, ICCP 2017, Stanford, United States, 12/05/17. DOI: 10.1109/ICCPHOT.2017.7951488
Reinbacher C, Munda G, Pock T. Real-time panoramic tracking for event cameras. In 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings. Institute of Electrical and Electronics Engineers. 2017. 7951488. Available from: DOI: 10.1109/ICCPHOT.2017.7951488
Reinbacher, Christian ; Munda, Gottfried ; Pock, Thomas. / Real-time panoramic tracking for event cameras. 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings. Institute of Electrical and Electronics Engineers, 2017.
@inproceedings{902c032214ec4aacb0ac54c928a2ddcc,
title = "Real-time panoramic tracking for event cameras",
abstract = "Event cameras are a paradigm shift in camera technology. Instead of full frames, the sensor captures a sparse set of events caused by intensity changes. Since only the changes are transferred, those cameras are able to capture quick movements of objects in the scene or of the camera itself. In this work we propose a novel method to perform camera tracking of event cameras in a panoramic setting with three degrees of freedom. We propose a direct camera tracking formulation, similar to state-of-the-art in visual odometry. We show that the minimal information needed for simultaneous tracking and mapping is the spatial position of events, without using the appearance of the imaged scene point. We verify the robustness to fast camera movements and dynamic objects in the scene on a recently proposed dataset [18] and self-recorded sequences.",
author = "Christian Reinbacher and Gottfried Munda and Thomas Pock",
year = "2017",
month = "6",
day = "16",
doi = "10.1109/ICCPHOT.2017.7951488",
language = "English",
booktitle = "2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers",
address = "United States",
}

TY - GEN

T1 - Real-time panoramic tracking for event cameras

AU - Reinbacher,Christian

AU - Munda,Gottfried

AU - Pock,Thomas

PY - 2017/6/16

Y1 - 2017/6/16

N2 - Event cameras are a paradigm shift in camera technology. Instead of full frames, the sensor captures a sparse set of events caused by intensity changes. Since only the changes are transferred, those cameras are able to capture quick movements of objects in the scene or of the camera itself. In this work we propose a novel method to perform camera tracking of event cameras in a panoramic setting with three degrees of freedom. We propose a direct camera tracking formulation, similar to state-of-the-art in visual odometry. We show that the minimal information needed for simultaneous tracking and mapping is the spatial position of events, without using the appearance of the imaged scene point. We verify the robustness to fast camera movements and dynamic objects in the scene on a recently proposed dataset [18] and self-recorded sequences.

AB - Event cameras are a paradigm shift in camera technology. Instead of full frames, the sensor captures a sparse set of events caused by intensity changes. Since only the changes are transferred, those cameras are able to capture quick movements of objects in the scene or of the camera itself. In this work we propose a novel method to perform camera tracking of event cameras in a panoramic setting with three degrees of freedom. We propose a direct camera tracking formulation, similar to state-of-the-art in visual odometry. We show that the minimal information needed for simultaneous tracking and mapping is the spatial position of events, without using the appearance of the imaged scene point. We verify the robustness to fast camera movements and dynamic objects in the scene on a recently proposed dataset [18] and self-recorded sequences.

UR - http://www.scopus.com/inward/record.url?scp=85025476941&partnerID=8YFLogxK

U2 - 10.1109/ICCPHOT.2017.7951488

DO - 10.1109/ICCPHOT.2017.7951488

M3 - Conference contribution

BT - 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings

PB - Institute of Electrical and Electronics Engineers

ER -