Fusion of Point Clouds derived from Aerial Images

Andreas Schönfelder, Karlheinz Gutjahr, Roland Perko, Mathias Schardt

Research output: Contribution to conference › Paper


State-of-the-art dense image matching, in combination with advances in camera technology, enables the reconstruction of scenes at a new level of spatial resolution and offers new mapping potential. This work presents a strategy for fusing highly redundant disparity maps by applying a local filtering method to a set of classified and oriented 3D point clouds. The information obtained from stereo matching is enhanced by computing a set of normal maps and by classifying the disparity maps into quality classes based on total variation. With this information given, a filtering method is applied that fuses the oriented point clouds along the surface normals of the 3D geometry. The proposed fusion strategy aims at reducing point cloud artifacts while generating a non-redundant surface representation that prioritizes high-quality disparities. The potential of the fusion method is evaluated on airborne imagery (oblique and nadir) using reference data from terrestrial laser scanners.
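The abstract mentions classifying disparity maps into quality classes based on total variation. As a minimal sketch of that idea (the thresholds, window-free per-pixel formulation, and three-class scheme are assumptions, not details from the paper), one could compute the per-pixel total variation of a disparity map as the sum of absolute forward differences and bin it into quality classes, where low variation indicates smooth, reliable disparities:

```python
import numpy as np

def classify_disparity_quality(disparity, thresholds=(0.5, 2.0)):
    """Classify each pixel of a disparity map into quality classes
    based on local total variation (low TV = smooth = high quality).

    NOTE: thresholds and class count are illustrative assumptions.
    Returns (tv, classes) where classes: 0 = high, 1 = medium, 2 = low.
    """
    # Absolute forward differences along rows and columns; prepending
    # the first row/column keeps the output the same shape as the input.
    dy = np.abs(np.diff(disparity, axis=0, prepend=disparity[:1, :]))
    dx = np.abs(np.diff(disparity, axis=1, prepend=disparity[:, :1]))
    tv = dx + dy
    # Bin the total variation into quality classes.
    classes = np.digitize(tv, thresholds)
    return tv, classes

# Usage: a flat disparity patch is all high quality; a sharp step
# (a typical matching artifact at depth discontinuities) is low quality.
flat = np.zeros((4, 4))
step = np.zeros((4, 4))
step[:, 2:] = 10.0
_, flat_cls = classify_disparity_quality(flat)
_, step_cls = classify_disparity_quality(step)
```

In the paper's pipeline such a class map would then steer the fusion, so that points backed by high-quality disparities dominate the merged surface.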
Original language: English
Number of pages: 144
Publication status: Published - 2017
Event: OAGM and ARW Joint Workshop - Palais Eschenbach, Wien, Austria
Duration: 10 May 2017 - 12 May 2017


Conference: OAGM and ARW Joint Workshop

