Extracting Quantitative Descriptions of Pedestrian Pre-crash Postures from Real-world Accident Videos

Martin Schachner*, Bernd Schneider, Wolfgang Sinz, Corina Klug

*Corresponding author for this work

Publication: Contribution to book/report/conference proceedings › Conference paper

Abstract

Real-world accident videos are a promising data source for investigating pedestrian pre-collision behaviour, which significantly affects in-crash kinematics. However, quantifying behaviours from this data source and deriving pose descriptions that reveal joint locations and angles is challenging. This study addresses this issue and introduces a method for quantitative evaluation by extracting 3D poses from real-world accident footage. The method combines common computer vision approaches with optimisation techniques to align a 3D human model to 2D pose information extracted from the videos. The capabilities of the method were assessed by applying it to a dataset containing measured ground-truth data. To further demonstrate the method's capabilities on real-world accident videos, a dataset was created from publicly available sources. Three videos were selected from the dataset, showing typical pedestrian pre-collision reactions such as raising the arms or leaning back, before the introduced method was applied to reconstruct the 3D joint positions and angles for multiple video frames prior to impact. The results emphasise that accident videos can be used to obtain quantitative pedestrian postures and movement patterns, which are required to derive realistic boundary conditions for pedestrian simulations.
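The core idea named in the abstract, aligning a 3D model to 2D pose detections via optimisation, can be sketched as a reprojection-error minimisation. The snippet below is a minimal toy illustration, not the paper's implementation: it assumes a known pinhole camera, a rigid four-joint template, and a single in-plane rotation plus translation as free parameters, whereas the actual method fits a full articulated human model with joint angles. All names and numbers are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def project(points_3d, focal=1000.0, cx=640.0, cy=360.0):
    # Pinhole camera model: perspective divide, then pixel offset.
    x = focal * points_3d[:, 0] / points_3d[:, 2] + cx
    y = focal * points_3d[:, 1] / points_3d[:, 2] + cy
    return np.stack([x, y], axis=1)

def transform(points, params):
    # Rigid transform: rotation about the camera z-axis + translation.
    theta, tx, ty, tz = params
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return points @ rot.T + np.array([tx, ty, tz])

def fit_pose(template, detections_2d, x0=(0.0, 0.0, 0.0, 5.0)):
    # Minimise the 2D reprojection error over the transform parameters.
    def residuals(params):
        return (project(transform(template, params)) - detections_2d).ravel()
    return least_squares(residuals, x0=x0).x

# Toy 4-joint "skeleton" (pelvis, neck, two shoulders), metres, camera frame.
template = np.array([[0.0, 0.9, 0.0], [0.0, 0.0, 0.0],
                     [-0.2, -0.5, 0.1], [0.2, -0.5, -0.1]])

# Synthesise "2D keypoint detections" from a known ground-truth pose,
# then recover that pose by optimisation.
true_params = np.array([0.3, 0.1, -0.2, 4.0])
detections = project(transform(template, true_params))
estimate = fit_pose(template, detections)
reproj_err = np.abs(project(transform(template, estimate)) - detections).max()
```

In practice the 2D detections would come from an off-the-shelf pose estimator, the camera parameters would themselves be unknowns or calibrated from the scene, and per-joint angle parameters would replace the single rigid transform.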
Original language: English
Title: 2020 IRCOBI Europe Conference
Place of publication: Munich
Publisher: International Research Council on the Biomechanics of Injury
Pages: 231-249
Number of pages: 19
ISSN (Print): 2235-3151
Publication status: Published - 1 Sep 2020

