Ego-Motion Correction based on Static Objects detected by an Automotive Lidar Sensor System

Conference: AmE 2019 – Automotive meets Electronics, 10. GMM-Fachtagung
03/12/2019 – 03/13/2019 in Dortmund, Germany

Proceedings: GMM-Fb. 93: AmE 2019

Pages: 6
Language: English
Type: PDF


Authors:
Stannartz, Niklas; Wissing, Christian; Bertram, Torsten (TU Dortmund University, Institute of Control Theory and Systems Engineering, 44227 Dortmund, Germany)
Krueger, Martin; Tolmidis, Avraam; Ali, Syed Irtiza; Nattermann, Till (ZF, Automated Driving & Integral Cognitive Safety, 40547 Düsseldorf, Germany)

Abstract:
This paper presents a framework for ego-motion correction that exploits static objects in the environment detected by an automotive lidar sensor system. Many previous works tackle this problem using raw point cloud data, which improves performance with respect to dead reckoning (DR) but neglects the embedded object-tracking algorithms already integrated in automotive sensors and increases the computational demands. Since exteroceptive sensors (cameras, radars, and lidars) typically segment distinct objects of the environment from the raw measurements, using classified static objects is a natural choice for ego-motion correction that reduces the computational overhead at the same time. The ego-motion correction is based on an observability-constrained Unscented Kalman Filter (OCUKF) exploiting detected point- and line-shaped static objects that are temporally maintained in a local map of the visible surroundings. For the latter type, a measurement model for continuous line-shaped landmarks is proposed that tackles disadvantages of the model commonly used in the literature. Finally, experimental results show the improved ego-motion estimation of the OCUKF algorithm with respect to DR.
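To make the filtering approach concrete, the following is a minimal sketch of a standard UKF prediction step for a 2D ego-pose state [x, y, heading], assuming a simple constant-velocity/constant-turn-rate motion model with hypothetical inputs `v` and `omega`. This illustrates only the generic unscented transform; the paper's observability-constrained variant (OCUKF) and its point-/line-landmark measurement models are not reproduced here.

```python
import numpy as np

def sigma_points(mu, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points and weights of the unscented transform."""
    n = mu.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)          # matrix square root of scaled covariance
    pts = np.vstack([mu, mu + S.T, mu - S.T])      # shape (2n+1, n)
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def ukf_predict(mu, P, f, Q):
    """Propagate mean and covariance through the nonlinear motion model f."""
    pts, wm, wc = sigma_points(mu, P)
    prop = np.array([f(p) for p in pts])           # push each sigma point through f
    mu_pred = wm @ prop                            # weighted mean
    diff = prop - mu_pred
    P_pred = (wc[:, None] * diff).T @ diff + Q     # weighted covariance + process noise
    return mu_pred, P_pred

def motion(x, v=1.0, omega=0.1, dt=0.1):
    """Illustrative unicycle motion model for state [x, y, heading]."""
    return np.array([x[0] + v * dt * np.cos(x[2]),
                     x[1] + v * dt * np.sin(x[2]),
                     x[2] + omega * dt])

mu0 = np.zeros(3)
P0 = np.eye(3) * 0.01
Q = np.eye(3) * 1e-4
mu1, P1 = ukf_predict(mu0, P0, motion, Q)
```

In a full filter, this prediction would be followed by an update step in which detected static landmarks from the sensor's object list serve as measurements; the observability constraint of the OCUKF additionally restricts how linearization points are chosen, which this sketch omits.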