Fusion of Point Clouds derived from Aerial Images

Andreas Schönfelder, Karlheinz Gutjahr, Roland Perko, Mathias Schardt

Research output: Contribution to conference › Paper › peer-review

Abstract

State-of-the-art dense image matching, in combination with advances in camera technology, enables the reconstruction of scenes at unprecedented spatial resolution and offers new mapping potential. This work presents a strategy for fusing highly redundant disparity maps by applying a local filtering method to a set of classified and oriented 3D point clouds. The information obtained from stereo matching is enhanced by computing a set of normal maps and by classifying the disparity maps into quality classes based on total variation. With this information, a filtering method is applied that fuses the oriented point clouds along the surface normals of the 3D geometry. The proposed fusion strategy aims at reducing point cloud artifacts while generating a non-redundant surface representation that prioritizes high-quality disparities. The potential of the fusion method is evaluated on airborne imagery (oblique and nadir), using reference data from terrestrial laser scanners.
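The abstract describes three processing steps: a total-variation-based quality classification of the disparity maps, the computation of per-pixel surface normals, and a fusion of redundant 3D points along the local normal. The sketch below illustrates, in plain NumPy/SciPy, how such a pipeline could look in principle. It is a minimal sketch, not the authors' implementation; all function names, window sizes, thresholds, and the weighted-median fusion rule are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def total_variation_map(disparity, window=5):
    """Local total variation of a disparity map: |d/dx| + |d/dy|,
    averaged over a window x window neighbourhood (illustrative proxy)."""
    gy, gx = np.gradient(disparity.astype(float))
    return uniform_filter(np.abs(gx) + np.abs(gy), size=window)


def classify_quality(tv, thresholds=(0.5, 1.5)):
    """Bin pixels into quality classes 0 (smooth, best) .. len(thresholds)
    (noisy, worst) by their local total variation; thresholds are assumed."""
    return np.digitize(tv, thresholds)


def normals_from_points(points):
    """Per-pixel surface normals for an H x W x 3 point map, taken as the
    normalised cross product of the horizontal and vertical point gradients."""
    dx = np.gradient(points, axis=1)
    dy = np.gradient(points, axis=0)
    n = np.cross(dx, dy)
    return n / (np.linalg.norm(n, axis=-1, keepdims=True) + 1e-12)


def fuse_along_normal(center, normal, candidates, weights):
    """Fuse redundant 3D candidates for one surface point: weighted median
    of their signed offsets along the local surface normal."""
    offsets = (candidates - center) @ normal          # signed distances along the normal
    order = np.argsort(offsets)
    cum = np.cumsum(np.asarray(weights, dtype=float)[order])
    median_idx = order[np.searchsorted(cum, 0.5 * cum[-1])]
    return center + offsets[median_idx] * normal


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Synthetic disparity map with noise and one depth discontinuity.
    disparity = rng.normal(10.0, 0.1, size=(64, 64))
    disparity[:, 32:] += 5.0
    quality = classify_quality(total_variation_map(disparity))
    print("pixels per quality class:", np.bincount(quality.ravel()))

    # Normals of a flat z = 0 plane should point along +z.
    xs, ys = np.meshgrid(np.arange(4.0), np.arange(4.0))
    plane = np.dstack([xs, ys, np.zeros_like(xs)])
    print("plane normal:", normals_from_points(plane)[1, 1])

    # Fuse eight noisy redundant observations of one surface point.
    center = np.array([0.0, 0.0, 0.0])
    normal = np.array([0.0, 0.0, 1.0])
    candidates = center + rng.normal(0, 0.05, size=(8, 3))
    print("fused point:", fuse_along_normal(center, normal, candidates, np.ones(8)))
```

In this sketch the quality class would determine the weights passed to the fusion step, so that low-variation (high-quality) disparities dominate the fused surface point, which mirrors the prioritization described in the abstract.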
Original language: English
Pages: 139-144
Publication status: Published - 2017
Event: OAGM/AAPR ARW 2017: Joint Workshop on “Vision, Automation & Robotics”, Palais Eschenbach, Wien, Austria
Duration: 10 May 2017 – 12 May 2017
http://www.roboticsworkshop.at/index.php

Conference

Conference: OAGM/AAPR ARW 2017
Abbreviated title: OAGM/AAPR ARW 2017
Country/Territory: Austria
City: Wien
Period: 10/05/17 – 12/05/17
Internet address: http://www.roboticsworkshop.at/index.php
