Learning Depth Calibration of Time-of-Flight Cameras

David Ferstl, Christian Reinbacher, Gernot Riegler, Matthias Rüther, Horst Bischof

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Abstract

We present a novel method for the automatic calibration of modern consumer Time-of-Flight cameras. These sensors usually come equipped with an integrated color camera. Although they deliver acquisitions at high frame rates, they often suffer from incorrect calibration and low accuracy due to multiple error sources. Using information from both cameras together with a simple planar target, we show how to accurately calibrate both the color and the depth camera, and how to tackle most error sources inherent to Time-of-Flight technology in a unified calibration framework. Automatic feature detection minimizes user interaction during calibration. We utilize a Random Regression Forest to optimize the manufacturer-supplied depth measurements. We show the improvements over commonly used depth calibration methods in a qualitative and quantitative evaluation on multiple scenes, acquired with an accurate reference system, for the application of dense 3D reconstruction.
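
The abstract describes the learning step only at a high level. As a minimal sketch (not the authors' implementation), the snippet below illustrates how a random regression forest could be trained to predict a per-pixel correction of raw Time-of-Flight depth against reference depths; the feature set (raw depth, IR amplitude, pixel coordinates), the synthetic data, and the use of scikit-learn's RandomForestRegressor are assumptions made for this example only.

# Minimal sketch (assumption, not the paper's method): learning a depth
# correction for Time-of-Flight measurements with a random regression forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: per-pixel raw ToF depth, IR amplitude, and image
# coordinates as features; depth from an accurate reference system as target.
n = 10_000
raw_depth = rng.uniform(0.5, 5.0, n)               # metres, from the ToF sensor
amplitude = rng.uniform(0.1, 1.0, n)               # per-pixel IR amplitude
u = rng.uniform(0.0, 640.0, n)                     # pixel column
v = rng.uniform(0.0, 480.0, n)                     # pixel row
X = np.column_stack([raw_depth, amplitude, u, v])

# Synthetic "reference" depth: raw depth plus a distance- and position-dependent
# bias, standing in for the systematic ToF errors the forest is meant to learn.
ref_depth = raw_depth - 0.02 * np.sin(4.0 * raw_depth) - 1e-4 * (u - 320.0)

# Learn a per-pixel depth correction (residual) rather than the absolute depth.
forest = RandomForestRegressor(n_estimators=50, max_depth=12, n_jobs=-1, random_state=0)
forest.fit(X, ref_depth - raw_depth)

# At test time, add the predicted correction to the raw measurement.
corrected = raw_depth + forest.predict(X)
print("mean abs error before:", np.abs(ref_depth - raw_depth).mean())
print("mean abs error after: ", np.abs(ref_depth - corrected).mean())

Predicting the residual rather than the absolute depth keeps the forest focused on the systematic error pattern and leaves the raw measurement as a sensible fallback where training data is sparse.
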
Original language: English
Title of host publication: British Machine Vision Conference
DOI: 10.5244/C.29.102
Publication status: Accepted/In press - 2015

Fields of Expertise

  • Information, Communication & Computing

Cite this

Ferstl, D., Reinbacher, C., Riegler, G., Rüther, M., & Bischof, H. (Accepted/In press). Learning Depth Calibration of Time-of-Flight Cameras. In British Machine Vision Conference. https://doi.org/10.5244/C.29.102

@inproceedings{f5e8217788db4f63a8d024f2d491aef9,
title = "Learning Depth Calibration of Time-of-Flight Cameras",
author = "David Ferstl and Christian Reinbacher and Gernot Riegler and Matthias R{\"u}ther and Horst Bischof",
year = "2015",
doi = "10.5244/C.29.102",
language = "English",
booktitle = "British Machine Vision Conference",

}

TY - GEN

T1 - Learning Depth Calibration of Time-of-Flight Cameras

AU - Ferstl, David

AU - Reinbacher, Christian

AU - Riegler, Gernot

AU - Rüther, Matthias

AU - Bischof, Horst

PY - 2015

Y1 - 2015

U2 - 10.5244/C.29.102

DO - 10.5244/C.29.102

M3 - Conference contribution

BT - British Machine Vision Conference

ER -