Multi-Sensor Fusion for Localization of a Mobile Robot in Outdoor Environments

Conference: ISR/ROBOTIK 2010 - ISR 2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on Robotics)
June 7-9, 2010, Munich, Germany

Proceedings: ISR/ROBOTIK 2010

Pages: 6 | Language: English | Type: PDF


Emter, Thomas; Saltoğlu, Arda; Petereit, Janko (Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (IOSB), Systems for Measurement, Control and Diagnosis (MRD), Fraunhoferstr. 1, 76131 Karlsruhe, Germany)
Emter, Thomas (Lehrstuhl für Interaktive Echtzeitsysteme, Institute for Anthropomatics, Karlsruhe Institute of Technology (KIT), Germany)

An essential capability for a mobile robot to perform autonomous navigation is the ability to localize itself in its environment. The most basic way to perform localization is dead-reckoning, i.e., using the robot's relative measuring sensors such as odometry (wheel encoders) and incrementally incorporating the measured revolutions of the robot's wheels from a known starting position. As these sensors only deliver relative measurements and all sensors are subject to noise, the uncertainty of the pose grows boundlessly with the covered distance.

In outdoor environments, navigation sensors like GPS and a compass are a viable option. They measure absolute quantities and therefore do not suffer from error accumulation, but they are prone to local disturbances by surrounding objects. The compass measurements are degraded by disturbances of the terrestrial magnetic field, e.g., by metal fences or the ventilation fans of air-conditioning systems. With a low-cost differential GPS receiver, the significant remaining source of error is multipath propagation due to reflections and shadowing effects of large objects like buildings. As the reflections depend on the constellation of the receiver and the satellites relative to nearby reflecting surfaces, these errors are time-variant and locally varying.

For precise self-localization the combination of several sensors is essential, since due to the noisy measurements no single sensor is sufficient. The data from the sensors is fused into a combined estimate, resulting in a more accurate localization. A new Kalman filter based approach is presented to perform multi-sensor fusion for online localization under real-time constraints. While a 2D localization is usually sufficient for indoor applications of mobile robots, as the robot typically operates on flat floors, a full 6 DoF estimation of position and attitude is necessary in outdoor environments, where the assumption of flat ground does not hold.
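The unbounded growth of dead-reckoning uncertainty described above can be illustrated with a small Monte Carlo sketch. The noise magnitudes and the straight-line trajectory below are illustrative assumptions, not values from the paper: each step's travelled distance and heading are corrupted by Gaussian noise, and the spread of the resulting positions grows with the covered distance.

```python
import math
import random

random.seed(1)

def dead_reckon(steps, dist_std=0.02, heading_std=0.01):
    """Integrate noisy per-step odometry (hypothetical noise levels)."""
    x = y = theta = 0.0
    for _ in range(steps):
        theta += random.gauss(0.0, heading_std)   # heading drifts as a random walk
        d = 1.0 + random.gauss(0.0, dist_std)     # nominal 1 m step, noisy measurement
        x += d * math.cos(theta)
        y += d * math.sin(theta)
    return x, y

def spread(steps, runs=500):
    """RMS scatter of the final positions over many simulated runs."""
    xs, ys = zip(*(dead_reckon(steps) for _ in range(runs)))
    mx, my = sum(xs) / runs, sum(ys) / runs
    return math.sqrt(sum((x - mx) ** 2 + (y - my) ** 2
                         for x, y in zip(xs, ys)) / runs)

# The position uncertainty keeps growing with distance travelled.
spreads = {n: spread(n) for n in (10, 100, 1000)}
for n in (10, 100, 1000):
    print(n, round(spreads[n], 2))
```

Without an absolute reference, no amount of filtering removes this drift; it only changes how fast the uncertainty accumulates.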
To accomplish the 6 DoF estimation, relative measuring sensors and absolute measuring sensors are combined by means of multi-sensor fusion. The fusion combines the advantage of the relative measuring sensors, namely their local precision, with the capability of the absolute sensors to confine the global uncertainty, thus preventing unbounded error growth.
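This complementary behaviour can be sketched with a minimal one-dimensional Kalman filter. This is not the paper's actual 6 DoF filter; the noise variances, the 1 m step, and the GPS fix every tenth step are illustrative assumptions. Relative odometry drives the prediction step, so the covariance grows between fixes; absolute GPS measurements drive the update step, which confines it.

```python
import random

random.seed(2)

Q = 0.05 ** 2   # assumed odometry (process) noise variance per step
R = 1.0 ** 2    # assumed GPS measurement noise variance

true_x = 0.0          # ground-truth position (for simulating measurements)
x_est, P = 0.0, 1.0   # filter state estimate and its covariance
covariances = []

for step in range(200):
    # Robot moves 1 m; odometry measures the step with noise.
    true_x += 1.0
    odo = 1.0 + random.gauss(0.0, 0.05)

    # Prediction with the relative measurement: covariance accumulates.
    x_est += odo
    P += Q

    # Every 10th step an absolute GPS fix arrives: covariance shrinks.
    if step % 10 == 0:
        z = true_x + random.gauss(0.0, 1.0)
        K = P / (P + R)            # Kalman gain
        x_est += K * (z - x_est)   # correct the estimate toward the fix
        P = (1.0 - K) * P

    covariances.append(P)

# With odometry alone, P would grow linearly (200 * Q = 0.5 here);
# the periodic absolute updates keep it bounded below that.
print(max(covariances[100:]))
```

The same prediction/update structure carries over to the full 6 DoF case, where the state holds position and attitude and each sensor contributes its own measurement model.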