High performance Time-of-Flight and color sensor fusion with image-guided depth super resolution

Hannes Plank, Gerald Holweg, Thomas Herndl, Norbert Druml

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Abstract

In recent years, depth sensing systems have gained popularity and have begun to appear on the consumer market. Among these systems, PMD-based Time-of-Flight cameras are the smallest available and will soon be integrated into mobile devices such as smartphones and tablets. Like all other available depth sensing systems, PMD-based Time-of-Flight cameras do not produce perfect depth data. Because of the sensor's characteristics, the data is noisy and the resolution is limited. Fast movements cause motion artifacts: undefined depth values due to corrupted measurements. Combining the data of a Time-of-Flight and a color camera can compensate for these flaws and vastly improve depth image quality. This work uses color edge information as a guide, so the depth image is upscaled with resolution gain and lossless noise reduction. A novel depth upscaling method is introduced, combining the creation of high-quality depth data with fast execution. A high-end smartphone development board, a color camera, and a Time-of-Flight camera are used to create a sensor fusion prototype. The complete processing pipeline is efficiently implemented on the graphics processing unit in order to maximize performance. The prototype proves the feasibility of our proposed fusion method on mobile devices. The result is a system capable of fusing color and depth data at interactive frame rates. When depth information is available for every color pixel, new possibilities in computer vision, augmented reality, and computational photography arise. The evaluation shows that our sensor fusion solution provides depth images with upscaled resolution, increased sharpness, less noise, and fewer motion artifacts, while achieving high frame rates, thus significantly outperforming state-of-the-art solutions.
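The core idea the abstract describes, using high-resolution color edges as a guide so that upscaled depth discontinuities align with color discontinuities, is commonly realized with joint bilateral upsampling. The sketch below is an illustrative, unoptimized NumPy version of that standard technique, not the authors' GPU pipeline; the function and parameter names are assumptions for illustration only.

```python
import numpy as np

def joint_bilateral_upsample(depth_lr, color_hr, sigma_s=2.0, sigma_r=0.1, radius=2):
    """Upsample a low-resolution depth map to the color image's resolution.

    Each high-res output pixel averages nearby low-res depth samples,
    weighted by (a) spatial distance in the low-res grid and (b) color
    similarity in the high-res image, so depth edges snap to color edges.
    """
    h, w = color_hr.shape[:2]
    lh, lw = depth_lr.shape
    depth_hr = np.zeros((h, w), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            # fractional position of this output pixel in the low-res grid
            ly, lx = y * lh / h, x * lw / w
            y0, x0 = int(ly), int(lx)
            wsum = vsum = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y0 + dy, x0 + dx
                    if not (0 <= ny < lh and 0 <= nx < lw):
                        continue
                    # spatial weight: Gaussian falloff in the low-res grid
                    ws = np.exp(-((ny - ly) ** 2 + (nx - lx) ** 2) / (2 * sigma_s ** 2))
                    # range weight: color similarity between the output pixel
                    # and the neighbor's corresponding high-res position
                    cy = min(h - 1, int(ny * h / lh))
                    cx = min(w - 1, int(nx * w / lw))
                    diff = color_hr[y, x] - color_hr[cy, cx]
                    wr = np.exp(-np.dot(diff, diff) / (2 * sigma_r ** 2))
                    wgt = ws * wr
                    wsum += wgt
                    vsum += wgt * depth_lr[ny, nx]
            depth_hr[y, x] = vsum / wsum if wsum > 0 else depth_lr[y0, x0]
    return depth_hr
```

Because the range weight suppresses depth samples whose color differs from the output pixel, noisy depth is smoothed within uniform color regions while object boundaries stay sharp, which is the "edge-guided" behavior the abstract refers to; the paper's contribution is a novel, GPU-friendly variant of this idea that runs at interactive frame rates on a mobile platform.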

Language: English
Title of host publication: Proceedings of the 2016 Design, Automation and Test in Europe Conference and Exhibition, DATE 2016
Publisher: Institute of Electrical and Electronics Engineers
Pages: 1213-1218
Number of pages: 6
ISBN (Electronic): 9783981537062
Status: Published - 25 Apr 2016
Event: 19th Design, Automation and Test in Europe Conference and Exhibition, DATE 2016 - Dresden, Germany
Duration: 14 Mar 2016 - 18 Mar 2016

Conference

Conference: 19th Design, Automation and Test in Europe Conference and Exhibition, DATE 2016
Country: Germany
City: Dresden
Period: 14/03/16 - 18/03/16


Keywords

  • 3D sensing
  • GPGPU
  • image processing
  • sensor fusion
  • Time-of-Flight

ASJC Scopus subject areas

  • Hardware and Architecture
  • Safety, Risk, Reliability and Quality

Cite this

Plank, H., Holweg, G., Herndl, T., & Druml, N. (2016). High performance Time-of-Flight and color sensor fusion with image-guided depth super resolution. In Proceedings of the 2016 Design, Automation and Test in Europe Conference and Exhibition, DATE 2016 (pp. 1213-1218). [7459496] Institute of Electrical and Electronics Engineers.


Scopus: http://www.scopus.com/inward/record.url?scp=84973618780&partnerID=8YFLogxK