Vision based vehicle relocalization in 3D line-feature map using Perspective-n-Line with a known vertical direction

Louis Lecrosnier, Rémi Boutteau, Pascal Vasseur, Xavier Savatier, Friedrich Fraundorfer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Abstract

Common approaches for vehicle localization propose to match LiDAR data or 2D features from cameras to a prior 3D LiDAR map. Yet, these methods require both heavy computational power, often provided by a GPU, and a first rough localization estimate via GNSS to be performed online. Moreover, storing and accessing dense 3D LiDAR maps can be challenging in the case of city-wide coverage.

In this paper, we address the problem of global camera relocalization in a prior 3D line-feature map from a single image, in a GNSS-denied context and with no prior pose estimate. We propose a dual contribution. (1) We introduce a novel pose estimation method from lines (i.e. Perspective-n-Line, or PnL) with a known vertical direction. Our method benefits from a Gauss-Newton optimization scheme to compensate for the sensor-induced vertical direction errors and refine the overall pose. Our algorithm requires at least 3 lines to output a pose (P3L) and needs no reformulation to operate with a higher number of lines. (2) We propose a RANSAC (RANdom SAmple Consensus) 2D-3D line matching and outlier removal algorithm requiring solely one 2D-3D line pair to operate, i.e. RANSAC1. Our method reduces the number of iterations required to match features and can easily be modified to exhaustively test all feature combinations.

We evaluate the robustness of our algorithms on synthetic data and on a challenging sub-sequence of the KITTI dataset.
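The RANSAC1 idea described in the abstract, i.e. a single 2D-3D line pair per hypothesis, is possible because a known vertical direction leaves only one rotational unknown (yaw) plus translation. The following toy Python sketch is not the authors' implementation: the reduction to pure yaw estimation from line directions, the noise-free synthetic matches, and all names (`ransac1_yaw`, `yaw_from_pair`) are illustrative assumptions.

```python
import math
import random

def yaw_rotate(v, theta):
    # Rotate a 3D vector about the vertical (z) axis by angle theta.
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1], v[2])

def yaw_from_pair(map_dir, obs_dir):
    # Yaw that aligns the horizontal component of map_dir with obs_dir:
    # one line correspondence is enough to hypothesize the rotation.
    return math.atan2(obs_dir[1], obs_dir[0]) - math.atan2(map_dir[1], map_dir[0])

def wrap(a):
    # Wrap an angle into (-pi, pi].
    return (a + math.pi) % (2 * math.pi) - math.pi

def ransac1_yaw(pairs, thresh=0.05, iters=100, seed=0):
    # RANSAC with a minimal sample of ONE (3D line, observed line) pair
    # per hypothesis, scoring each hypothesis by its consensus set.
    rng = random.Random(seed)
    best_yaw, best_inliers = None, -1
    for _ in range(iters):
        m, o = pairs[rng.randrange(len(pairs))]
        yaw = wrap(yaw_from_pair(m, o))
        inliers = sum(1 for p, q in pairs
                      if abs(wrap(yaw_from_pair(p, q) - yaw)) < thresh)
        if inliers > best_inliers:
            best_yaw, best_inliers = yaw, inliers
    return best_yaw, best_inliers

# Synthetic scene: 20 correct line-direction matches under a true yaw of
# 0.7 rad, plus 8 outlier matches pairing unrelated directions.
rng = random.Random(1)
true_yaw = 0.7
pairs = []
for _ in range(20):
    d = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
    pairs.append((d, yaw_rotate(d, true_yaw)))
for _ in range(8):
    pairs.append(((rng.uniform(-1, 1), rng.uniform(-1, 1), 0.0),
                  (rng.uniform(-1, 1), rng.uniform(-1, 1), 0.0)))

yaw, inliers = ransac1_yaw(pairs)
```

With exact matches, any single correct pair recovers the rotation, and the consensus count rejects the wrong correspondences; the paper's full method additionally estimates translation and refines the pose (and the vertical direction itself) with Gauss-Newton.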

Original language: English
Title of host publication: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019
Publisher: Institute of Electrical and Electronics Engineers
Pages: 1263-1269
Number of pages: 7
ISBN (Electronic): 9781538670248
DOI: 10.1109/ITSC.2019.8916886
Publication status: Published - 1 Oct 2019
Event: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019 - Auckland, New Zealand
Duration: 27 Oct 2019 - 30 Oct 2019

Publication series

Name: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019

Conference

Conference: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019
Country: New Zealand
City: Auckland
Period: 27/10/19 - 30/10/19

ASJC Scopus subject areas

  • Artificial Intelligence
  • Management Science and Operations Research
  • Instrumentation
  • Transportation

Cite this

Lecrosnier, L., Boutteau, R., Vasseur, P., Savatier, X., & Fraundorfer, F. (2019). Vision based vehicle relocalization in 3D line-feature map using Perspective-n-Line with a known vertical direction. In 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019 (pp. 1263-1269). [8916886] (2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019). Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/ITSC.2019.8916886

Scopus record: http://www.scopus.com/inward/record.url?scp=85076818589&partnerID=8YFLogxK